Virtual reality refers strictly to an artificial immersive environment generated by a computer: for example, the environment you are in when you put on a head-mounted display and navigate through a 3D computer-generated world.
Some people tend to call any Three Dee Interface a virtual reality. This is wrong. The proper terms for advanced non-immersive interfaces include Cyber Space and Alternate Reality User Interface.
Terminal eyestrain and fatigue problems
Unfortunately, virtual reality (especially of the augmented reality variety) will never become very popular, and the reason is very simple. When you generate images for the human eye, the eye has to focus at the distance of the screen the image is projected on or from, but stereoscopic 3D images tell the brain to focus the eyes at the apparent distance of the objects instead. The mismatch causes eyestrain and headaches. Virtual reality will founder on the dumb fact that it causes eyestrain.
That's like saying "reality will founder, because some things are far away and you have to strain to see them." That's where our eyes evolved, dude!
There are many causes of eyestrain and similar strain and fatigue in VR, but that particular one, focal distance, is not inherent; the military has used heads-up displays focused at optical infinity for decades. That is most easily achieved with a folded optical path, which is bulky and expensive, but nothing in physics prevents better implementations (holographic lenses have been used in special circumstances, for instance).
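To put rough numbers on the focal-distance mismatch discussed above, here is a minimal back-of-the-envelope sketch in Python. Optometrists measure focusing demand in diopters (one over the distance in metres); the conflict is the gap between where the optics make the eye accommodate and where the stereo imagery asks the eyes to converge. The 2 m focal plane and 0.5 m virtual object are illustrative assumptions, not figures for any particular headset.

  # Accommodation/vergence mismatch, in diopters.
  # Illustrative numbers only -- not measurements of any real headset.
  def diopters(distance_m: float) -> float:
      """Focusing demand for an object at the given distance (metres)."""
      return 1.0 / distance_m

  screen_focal_distance_m = 2.0    # where the optics place the focal plane (assumed)
  virtual_object_distance_m = 0.5  # where the stereo disparity puts the object (assumed)

  accommodation = diopters(screen_focal_distance_m)    # 0.5 D: what the eye must focus at
  vergence = diopters(virtual_object_distance_m)       # 2.0 D: where the eyes converge
  print(f"conflict: {vergence - accommodation:.1f} D") # 1.5 D of accommodation/vergence conflict

A conflict of a diopter or more, sustained for long periods, is the kind of mismatch the complaint above describes; matching the display's focal distance to the distances at which content is drawn (as infinity-focused HUDs do for distant scenery) removes it.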
Can you tell us how serious the fatigue problem is for VR, roughly how many different causes of strain there are, and what proportion of them are inherent to VR?
Three of the most serious are computationally limited: foveal resolution, frame rate (temporal resolution), and latency of response (the display lags behind head movement); a rough sketch of the magnitudes involved follows this list. These were sufficient to make VR impractical for many purposes that were supposed to be just around the corner in past years. I believe that Jaron Lanier's company was in fact derailed directly because of these problems. Moore's law has obviously helped - but still not enough to turn these into non-issues.
Related display hardware limitations: phosphor persistence and the limited strobe rates of phosphors, LCDs, and so on, which cause blurring and limit maximum frame rates.
Frame rates ideally need to be enormous. There's a common misconception (e.g. in cynical comments in the game industry) that frame rates beyond the critical flicker fusion rate are useless. That's not true: the eyes can track fast-moving objects in the real world without blur, but on a frame-limited (e.g. 100 fps) display they can't, because the target is effectively pre-blurred.
Small-scale off-focus. Even a small amount of blur, from any source whatsoever (lenses, computation, imaging array), causes increasing fatigue as the eyes attempt to find a non-existent accommodation. This can also cause long-term near-sightedness or far-sightedness, and so is possibly more serious than simple fatigue issues.
High or low contrast. Natural illumination varies over 7 orders of magnitude, and our visual system adjusts (iris/pupil, eyelids, rhodopsin and color-pigment depletion and recovery). Computer displays of all sorts tend to have contrast ratios more like 100:1 to 500:1, which comparatively speaking makes images washed out and harder to resolve, and that causes fatigue. Contrast is maximized in a darkened ambient environment, but bright images in a dark environment cause fatigue as well.
Color gamut. No display technology ever invented (including print) can match the entire range of color vision; artificial displays have limited color gamuts (the most infamous example being the inability of color TV to display purple). The resulting artificiality contributes to fatigue.
This is just what immediately comes to mind; there are likely several more sources of fatigue as well.
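To give a feel for the magnitudes behind the latency, frame-rate, and contrast points above, here is a rough numerical sketch in Python. The head speed, latency, frame rate, and contrast figures are illustrative assumptions, not measurements of any particular system.

  # Rough magnitudes for three of the fatigue sources listed above.
  # All input numbers are illustrative assumptions, not measurements.
  import math

  # 1. Latency: how far the world appears to slip during a head turn.
  head_turn_deg_per_s = 200.0   # a brisk but ordinary head rotation (assumed)
  latency_s = 0.050             # 50 ms motion-to-photon latency (assumed)
  print(f"slip from latency: {head_turn_deg_per_s * latency_s:.0f} degrees")  # ~10 degrees

  # 2. Frame rate: smear when the eye smoothly tracks a moving target.
  # The eye tracks continuously but the display updates only once per frame,
  # so the target smears across the retina by (angular speed) * (frame time).
  target_deg_per_s = 100.0      # a moderately fast-moving object (assumed)
  frame_rate_hz = 100.0
  print(f"smear at {frame_rate_hz:.0f} fps: {target_deg_per_s / frame_rate_hz:.1f} degrees per frame")
  # Foveal acuity is on the order of 1/60 degree, so even 1 degree of smear
  # is nowhere near real-world sharpness.

  # 3. Contrast: dynamic range of natural light vs. a typical display.
  natural_orders = 7                      # roughly starlight to full sunlight
  display_contrast_ratio = 500            # an optimistic conventional display (assumed)
  print(f"natural scenes: ~{natural_orders} orders of magnitude; "
        f"a {display_contrast_ratio}:1 display: ~{math.log10(display_contrast_ratio):.1f}")

Even with these charitable numbers, the display falls well short of what the visual system is built for, which is the point of the list above.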
It seems to me rather hard to guess whether all these technical issues will ever be overcome, but we certainly aren't anywhere close today. I nonetheless think that limited use is of interest, e.g. Steve Mann's experiments with augmented reality, as opposed to totally immersive, long-duration VR, which is highly problematic. (Besides, Star Trek/Matrix style VR would require full-body tactile input, which is obviously completely impossible today.) -- dm
Current applications
3D interfaces are important enough in some niche applications that users bite the bullet of eyestrain.
Current applications in Molecular Science: www.cnn.com
A variation is Augmented Reality, where information about what you are looking at is superimposed on your field of view in Real Time, typically with a small head-mounted display. See www.sciam.com
In theory, you could make screens that project light in the proper directions to fake more distant point sources, sidestepping the focal-distance objection above. It wouldn't be easy, but it isn't completely impossible like some other suggestions.
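A minimal 2-D geometry sketch of what "projecting light in the proper directions" would demand, assuming a 5 cm screen distance, a 3 m virtual point, and a 4 mm pupil (all illustrative values): each screen position must emit light only along the ray the real point would have sent through it into the pupil, rather than scattering in all directions as an ordinary pixel does.

  # 2-D back-of-the-envelope for a directional ("light-field") screen.
  # All numbers are illustrative assumptions.
  import math

  s = 0.05   # screen-to-eye distance in metres (assumed, headset-like)
  D = 3.0    # distance at which the virtual point should appear (assumed)
  vx = 0.0   # virtual point straight ahead of the eye

  for pupil_x in (-0.002, 0.0, 0.002):  # ~4 mm pupil, three sample rays
      # Where the ray from the virtual point to this pupil position crosses the
      # screen, and the direction (relative to the screen normal) to emit it in.
      screen_x = pupil_x + (s / D) * (vx - pupil_x)
      angle = math.degrees(math.atan2(pupil_x - screen_x, s))
      print(f"pupil {pupil_x * 1000:+.1f} mm -> screen {screen_x * 1000:+.3f} mm, emit at {angle:+.3f} deg")

The required angular control is on the order of hundredths of a degree in this toy case, which is why it wouldn't be easy, but it is ordinary geometry rather than a violation of physics.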
Lots of crap moved to Mind Reading
Also see: Virtual Reality Modeling Language, Real Virtuality
See original on c2.com