One of the more interesting new features of the iPhone 5s is its fingerprint reader, especially because, up until now, fingerprint readers have not been a popular feature on laptops or other consumer devices. Tech history is littered with cool ideas like this that never achieved great popularity, and it's not always easy to say why: The tablet took off, yet people still prefer audio-only calls to videophones, and although anyone can watch broadcast TV on a phone, most prefer to stream video from services like Netflix instead. Here are a few tech trends that looked like they were going to take off, but for whatever reason, never got off the runway.
First off, the iPhone 5s could be the fingerprint reader's breakthrough device, the product that finally makes this technology a mainstay instead of a quirky, optional feature. Up until now, though, the general public has shown very little interest in this security feature. IBM released a consumer laptop with a fingerprint reader back in 2004, and plenty of companies have offered similar readers since, but the feature never became a make-or-break selling point. It's arguably easier to spoof a fingerprint scanner than to guess a good password, and most people don't keep anything on their laptops that requires government-level security anyway.
Videophones may seem like a trend that actually did take off. After all, you can use your smartphone to communicate via FaceTime or Skype. Think about how often you use these services, though. For the vast majority of your phone calls, simply dialing the number and talking is enough; besides, on an audio-only call, you don't have to wear a shirt, and you can pace back and forth like a caged lion. Videophones have actually been around since 1936, and available to the public since 1964. Shows like "The Jetsons" popularized the devices, and people assumed that as soon as videophones became affordable, there would be one in every home. As it turns out, people prefer privacy to observing body language.
Near field communication (NFC) is not a bad idea; there's just no real demand for it. The technology allows smartphones and other mobile devices in close proximity to communicate via short-range radio signals, rather than over Wi-Fi or mobile networks. Because NFC signals are difficult to intercept, the technology provides a fast, secure way to pay for services (such as parking meters) or to exchange social networking information by tapping two phones together. The problem is that NFC is not replacing an arduous task with an easy one; it's replacing an easy task with a slightly easier one. Unless paying with cash or a credit card, or typing someone's name into Facebook, becomes significantly more difficult, NFC is probably destined to remain a curiosity rather than a staple phone feature.
Like Spinal Tap, broadcast TV on smartphones is really big in Japan. For whatever reason, though, the practice hasn't caught on in most other regions. Handheld antenna TVs were fairly popular back in the '90s, so transferring that functionality to smartphones seems like a no-brainer. It hasn't happened, though. To begin with, few services stream broadcast TV, so there aren't many ways to watch such content on a smartphone in the first place. Aereo, one of only a handful of TV-streaming services in the West, doesn't even have a dedicated mobile app. Streaming content from services like Netflix and Hulu is simply easier and more convenient: Would you rather tune in halfway through a show and sit through commercials, or start and stop shows at your leisure?
The general public was still a little hesitant to fully embrace the Internet in the late 1990s, but Microsoft, which acquired WebTV in 1997, bet that combining a browser with a television might allay some of those fears. WebTV, as the name suggested, let users surf the Internet from their TVs by connecting a set-top box to a modem. Although WebTV did not tank outright, it never gained a huge following, as people realized they were perfectly fine keeping their Internet and TV activities separate. Similar technology persists today: Smart TVs often incorporate Web browsers, and anyone with an Xbox 360, PS3 or Wii U can bring Internet content to a living room screen. Browsing on TVs never died a sudden death; it just peaked at a very low height.
If not for "Quadrophenia," a popular album by The Who, most people born after 1960 or so might never have encountered the word "quadrophonic" at all. The premise behind quadrophonic sound is simple: Set up speakers in the four corners of a room, and have each play an independent channel. Put listeners in the middle, and let them experience a richer, more varied sound from their favorite musicians; Pink Floyd even performed live using quadrophonic sound. Although these recordings sounded great, they were too far ahead of their time. Encoding quadrophonic sound on vinyl records proved difficult, and replicating the setup at home was prohibitively expensive. Had it launched today, with digital audio and affordable home surround-sound systems, quadrophonic sound might have stuck around. Instead, it has mostly disappeared.
Virtual reality (VR) is an odd beast. All the required components already exist (video glasses, realistic first-person games, noise-blocking headphones and motion controls), but no one seems able to fit them together in a way the public wants. Technologists have been experimenting with virtual reality since the mid-19th century, working their way up from realistic painted panoramas to driving simulators to glasses that place a virtual environment in front of a user's eyes. Even so, today's VR systems are generally expensive and unwieldy, and people seem perfectly content to experience games and simulations on a flat screen. Upcoming tech like Microsoft's IllumiRoom and the Oculus Rift could kick-start the VR trend again, but history is not on their side.