
The 15 best tech products of the decade

(Image credit: Tom's Guide)

Think back to 2009. The iPhone was still in its infancy. The fastest networks were 3G. We all used wired headphones. And no one had a clue what an Alexa was.

Fast forward to the end of 2019, and a lot has changed in the technology landscape.

Smartphones have replaced cameras. Wireless earbuds have become ubiquitous. We're streaming Disney Plus and Netflix right to the palm of our hands and our 4K TVs. And asking Alexa pretty much everything that comes into our heads has become second nature.

And smartwatches like the Apple Watch — something that did not exist before this decade — are saving lives with the heart-health data they provide on the fly.

But it really hit me that we're entering a new decade as I conducted an interview for this very feature. As I asked questions, the Pixel 4's Recorder app transcribed what my interviewee was saying in real time. No, it's not as cool as a flying car or jetpack, but it was a truly amazing moment for me, one I didn't think possible 10 years ago.

The past decade has certainly seen the darker side of technology and its impact, including huge data breaches, privacy concerns around location tracking and especially Facebook's several violations of public trust. But on balance, I'm optimistic about what lies ahead.

As we hurtle toward a new decade of innovation that includes a 5G explosion, self-driving cars and augmented-reality glasses, it's important to take a step back to recognize the products that defined the last ten years.

After much internal debate here at Tom's Guide, here are the 15 best tech products of the decade. 

Sony A7R II (2015)

(Image credit: Tom's Guide)

While Sony helped introduce the world to full-frame mirrorless cameras with the original A7R in 2013, Sony really nailed the design with its successor, the A7R II. Launched in 2015, the A7R II boasted a number of improvements that have become standard in the industry: a larger backside-illuminated 42.4MP sensor, in-body 5-axis image stabilization and full-width 4K video. All this in a camera that's much more compact than a typical DSLR. 

Other features that are just as good now as they were five years ago: an ISO range of 100 to 102,400; a 399-point hybrid autofocus system; and a 2.3-million-dot OLED electronic viewfinder (EVF), all housed in a weather-sealed magnesium alloy body.

"Although backside-illuminated sensors (BSI) had been previously developed in much smaller smartphone image sensors, BSI in full-frame made it possible to enjoy high resolution and high sensitivity in a single camera – something which had not been previously available," said Mark Weir, senior manager of technology at Sony Electronics.

It would take Canon and Nikon another three years before they launched their own full-frame mirrorless cameras, and even against them, the A7R II still acquits itself well. 

"It speaks to the significance of the Sony A7R II that it's now two generations old and it's *still* an incredibly potent camera," said James Artaius, the editor-in-chief of Digital Camera World. "Compare most of the bodies on the market today with this mirrorless marvel from 2015 and it still packs a ton of features that are tough to match. The Sony A7R II took great pleasure in tearing up the rulebook nearly five years ago, and it remains one of the key chess pieces in Sony's spectacular gambit that saw it checkmate the supposed kings of full-frame imaging."

Since the release of the A7R II, Sony has launched two successors, the A7R III and the A7R IV, the latter of which has an even larger 61-MP sensor, a better autofocus system, faster shooting speeds and improved battery life. Even so, the A7R II remains a very potent camera, and now that it's available for less than $1,500, it makes full-frame mirrorless photography even more accessible. — Mike Prospero

Xbox Game Pass (2017)

(Image credit: Tom's Guide)

The idea of a "Netflix for games" has been a mostly elusive one, chased by many gaming companies throughout the decade with mixed results. PlayStation Now launched in 2014, but it largely existed to deliver older PS3 games via the cloud when the PS4 was already red-hot. That same year gave us EA Access: an all-you-can-play vault of downloadable games, but one that was limited to EA properties such as Battlefield, Madden and FIFA.

Then Xbox Game Pass came around in 2017, and redefined what a gaming subscription could be.

"Our fans were telling us they wanted an easily accessible library of games, and combined with the ongoing popularity of subscriptions and memberships, saw an opportunity to deliver on that," said Ben Decker, head of Gaming Services Marketing at Microsoft.

Game Pass started as an Xbox One-exclusive service, offering access to more than 100 Xbox One games (and backward-compatible Xbox 360 games) for a $10 monthly fee. The library started off humbly enough, with first-party titles such as Halo 5 sitting alongside older fare like Mad Max and Saints Row IV.

Then, in 2018, Microsoft made the radical decision to offer all of its first-party games on Xbox Game Pass on the day of release, meaning subscribers could instantly jump into titles such as Sea of Thieves and Gears 5 as soon as they launched.

But Microsoft wasn't done. In 2019, the company debuted Xbox Game Pass for PC, a version of the service that includes first-party games as well as PC-centric fare like Cities: Skylines and Into the Breach. Better yet, the company also introduced Xbox Game Pass Ultimate: a $15-per-month service that gets you Game Pass for Xbox, Game Pass for PC and the console perks of Xbox Live Gold. That's a whole ton of games across two platforms for a low monthly fee.

Game Pass isn't just a great value — it's also opened doors for titles that might not have found an audience otherwise. Hit indie survival game Outer Wilds debuted on Game Pass to huge critical and fan praise in the summer of 2019, spreading with the same feverish word of mouth that a hot Netflix show does. According to an internal Microsoft survey, 91% of Game Pass members have played a game they wouldn't have tried without the service.

"There are some great games that require a base number of players for an optimal experience; without a huge marketing budget, these titles may struggle to ensure high visibility and adoption at launch," Decker said. "With Xbox Game Pass, titles such as Laser League and Blair Witch have the potential to reach millions of players at release, which helps contribute to the overall health of the game."

Perhaps most important, Xbox Game Pass already sets up Microsoft well for next year's console wars, when the Xbox Series X will go up against the PS5 in late 2020. With a huge library of backward-compatible games, eventual Project xCloud support and upcoming titles like Halo: Infinite on the roster, Game Pass seems poised to ensure that both current and future Xbox platforms are packed with great games to play for an accessible price. — Mike Andronico

LG OLED TV 55EA9800 (2013)

(Image credit: Tom's Guide)

LG's first OLED TV was sold in 2010, and other companies had made a handful of OLED TVs as far back as 2007, but it was the 2013 LG OLED 55EA9800 that really changed the game for the TV world. And change it has. OLED technology has become the yardstick by which we measure all displays, thanks to its perfect black levels, infinite contrast and incredible slimness.

The 55EA9800 was the first OLED TV to land in the US, and it grabbed lots of attention back in 2013. At that time, comparisons were made to plasma displays – remember those? – but at $9,999, it was still a truly premium product. 

The 55-inch OLED TV was revolutionary. It was the first big-screen implementation of OLED technology, which had been limited to much smaller TVs and device displays in the years prior. While 55 inches wasn't the biggest TV option available, it was definitely the biggest OLED to date.

But the 55EA9800 wasn't impressive solely for its size. This TV was packed with features, including a number of faddish offerings that you won't find on many sets today, like a curved screen, a number of 3D options (glasses included) and stunning transparent speakers embedded in the TV's clear acrylic stand.

Although many of the 55EA9800's most attention-grabbing features have faded into history as the fads of 3D and curved TVs have fallen by the wayside, the central promise of LG's first big OLED TV still holds: This will look amazing. Six years later, and the current offerings from LG remain at the top of the TV pyramid. The LG C9 OLED is at the top of our best TV list, and the LG Signature Z9 88-inch 8K OLED pushes the humble OLED TV to dimensions and resolutions that were positively fantastical only six years ago.

"OLED has been a game changing technology for LG and a catalyst to drive innovation in the industry," said Stephen Baker, vice president of industry analysis at NPD Group. He points out how OLED's ascendancy has also spurred other industry players like Samsung to innovate to stay competitive. We may owe LG's OLED thanks for the explosion of quantum-dot technology used to make LCD sets look brighter, and with better color.

And LG's current fortunes owe a lot to that first OLED set. Since 2013 LG has sold more than 5 million units. That's still a relatively small portion of the overall TV market – it's roughly 1% of the 220 million TVs sold each year – but the OLED sets make up more than one-third of the premium TV market, where sets cost $2,500 or more.

Other display technologies threaten to overtake OLED, including micro LED, which is being embraced by Samsung. But OLED ruled this decade, and its reign will likely continue for some time. — Brian Westover

Roku 3 (2013)

(Image credit: Tom's Guide)

Streaming players have become so common in the past five years that we take them almost entirely for granted. It goes without saying that you can connect a device to either a 2.4-GHz or 5-GHz Wi-Fi network, search across dozens of channels to find what you want to watch, then plug in your headphones and listen to your content without waking up the rest of the house. But until the Roku 3 came out in 2013, these weren't common features. In fact, media players before then looked a little primitive by comparison.

"This was the first big introduction of our new user interface, and also the new introduction of our headphone jack," said Lloyd Klarke, director of product management at Roku. "The Roku 3 allowed you to very quickly search across channels, and within channels. The other thing it introduced that no other player had done before was the headphone jack for private listening."

Although the Roku 3 was arguably ahead of its time, customers embraced it right away. "It's a five-star product," Klarke said. "Customers loved it. It was probably the best embodiment of Roku. Find your channels fast, find something to watch quickly and enjoy it."

Almost everything that viewers love about streaming gadgets in general, and Roku products in particular, debuted in the Roku 3. There was the customizable Roku OS, which let users arrange their favorite apps into convenient rows, in whichever order they wanted. There was dual-band Wi-Fi for faster buffering and more reliable video signal. But the biggest addition was the headphone jack in the remote control — still a staple feature in high-end Roku devices.

"Private listening took something we saw in the home that was a pain point and made it simple and easy to use," Klarke explained. "If you want to watch something and not disturb others, or someone else wants to watch something and you don't want to be disturbed, private listening solved that in the home. 

"No settings, no configuration, nothing to do except plug in a set of headphones, and Roku did the rest for you. We automatically diverted the audio for you to your remote control; we took care of the audio-video sync. We made it as simple as plugging in a headphone jack."

In fact, when I asked what's next for Roku, Klarke was keen to highlight how many strides the company has made with audio tech recently. The Roku Wireless Speakers, Roku Smart Soundbar and Roku Subwoofer are a logical evolution of the headphone jack: they simplify streaming tech by marrying high-quality sound to an intuitive Roku interface.

There were good streaming players before the Roku 3, and even better ones afterward. But this one would set the standard going forward. — Marshall Honorof

Xbox Adaptive Controller (2018)

(Image credit: Tom's Guide)

The gamepad is a familiar game controller for players around the world. But because it requires two hands, all 10 fingers and full mobility, some players can't use it. In response, Xbox revealed the Adaptive Controller to the world in 2018. It is the first officially made controller with fully customizable input methods, which lets users create a setup that perfectly meets their needs and play games as freely as anyone else.

"Our main goal when creating the Xbox Adaptive Controller was to design a product that was easy to use, affordable and readily available," said Gabi Michel, senior program manager at Microsoft. "We set out to empower gamers around the world and give them the opportunity to play the games they want, with the people they want, on the devices they want."

The Adaptive Controller began life as a project created during an internal Microsoft hackathon in the spring of 2015, inspired by stories of military veterans who struggled to use the standard Xbox controller, as well as building on the prior work of gaming accessibility interest groups. While there have been controller options available for gamers unable to use a standard controller fully, these were not mainstream products. "Frequently, custom rigs cost just as much as, if not more than a gaming console, and typically require technical expertise to build," Michel said.

It took three years of work and collaboration with experts and gamers before the Adaptive Controller as we know it was ready for users to buy. 

After its launch (on Global Accessibility Awareness Day), the Adaptive Controller was lauded for its design philosophy and how well it was executed. The team is still working on the project, though. "We're passionate about accessibility and are excited to continue on this journey," Michel said. "We haven't reached our end destination."

Microsoft recently announced the new Xbox, the console that will take over from the Xbox One and take on the PlayStation 5. When I asked about the future of the Adaptive Controller, Michel said that "we [will] continue our commitment to compatibility." — Richard Priday

Galaxy Note (2011)

(Image credit: Tom's Guide)

With a 5.3-inch screen, the original Galaxy Note seems puny compared with today's mega phablets, but it absolutely dwarfed the iPhone 4s back in October 2011. This was a big gamble on Samsung's part, and it ultimately pioneered an entire category of big-screen phones — despite the many critics who initially poked fun at the Note's imposing size.

"We knew that what we created was good when we put it in the hands of the first Note consumers…because they could do things and accomplish things that they can't anywhere else," said Suzanne De Silva, head of mobile product management and marketing at Samsung. 

Those things included seeing a lot more of a web page without having to scroll; a more immersive viewing experience when watching video, thanks to a colorful HD Super AMOLED display; and Samsung's new S Pen, which let users do everything from signing PDFs to jotting down ideas on the go.

The Galaxy Note evolved during the 2010s and debuted several more innovations as the screen grew, including Multi Window multitasking with the Galaxy Note II (5.5 inches), the Galaxy Note Edge with curved display (5.6 inches) and a screen-off memo feature and clickable auto-eject S Pen button with the Galaxy Note 5 (5.7 inches).

The Note line took a near-tragic turn in 2016 with the Galaxy Note 7, which had to be recalled due to a number of fires and faulty batteries. But Samsung roared back with the Note 8, which delivered a whopping 6.3-inch display along with a telephoto camera with Live Focus mode. Today's Galaxy Note lineup gives users two options: large (6.3 inches) in the Galaxy Note 10 and extra large (6.8 inches) with the Note 10 Plus. But both devices are remarkably compact given their screen sizes.

The Galaxy Note franchise has been so successful in taking big-screen phones mainstream that there are rumors that Samsung could sunset the brand. But Samsung has learned that the Note customer has very specific needs compared with the Galaxy S line.

"When we ask these consumers, 'What do you prioritize as the No. 1 thing you're looking for when you purchase your device?'" De Silva said, "time and time again, the Note consumer will say it is the S Pen. It is the productivity, the creativity. A lot of it has to do with not just what the Note brand has stood for over the decade, but also how we bring it together with the hardware and software." — Mark Spoonauer

Sling TV (2015)

(Image credit: Tom's Guide)

By 2015, streaming services like Netflix, Hulu and Amazon Video had successfully demonstrated that viewers didn't need expensive cable or satellite subscriptions to enjoy high-quality television and movies. However, live TV was still a bit of a problem. While an HD antenna could pick up some of the slack for local channels, shows like The Walking Dead, Better Call Saul and Mad Men remained locked behind cable packages that could top $100 per month.

Enter Sling TV: a streamlined subscription service from satellite TV provider Dish. This service promised a relatively bare-bones cable-replacement package, with only 14 channels at launch. But here's the kicker: The channels were generally the ones that viewers wanted most, and the service cost only $20 per month.

For the first time, users could live-stream paid channels without locking themselves into an expensive, complicated contract with a cable or satellite provider. Sling TV offered AMC, ESPN, the Food Network, Cartoon Network, the Disney Channel and El Rey, ensuring that there was a little something for everyone.

However, 14 channels and a low introductory price could take a service only so far. The really clever thing that Sling TV did was offer users à la carte channel packages for reduced prices. You could buy a handful of extra channels for an additional $5 to $15 fee, depending on what you wanted to watch. There were movie channel bundles, sports bundles, kids' bundles and even foreign-language bundles. Twenty dollars wouldn't buy you absolutely everything you wanted to watch — but maybe $25 or $30 would, without an additional $70 wasted on stuff you'd never want to watch.

Over the years, Sling TV has become more robust, more expensive and a little more confusing. The service now starts at $25, and is divided into Orange and Blue tiers, depending on what kind of programming you want. 

It's also no longer the only cable-replacement service in town. Since Sling TV debuted, it's been joined by the likes of Hulu with Live TV, AT&T TV, YouTube TV and PlayStation Vue (which is on the way out). However, Sling deserves credit for having a better interface and pricing structure than most of its competitors — and simply outlasting the others.

The TV market is in the process of fracturing again, as individual networks try to carve out their own content and sell users on boutique services, such as CBS All Access and the upcoming HBO Max. Will Sling TV fall by the wayside as users pick and choose their favorites — or become more valuable than ever by aggregating a variety of channels? We'll know for sure in a few years. — Marshall Honorof

Dell XPS 13 (2016)

(Image credit: Tom's Guide)

For a long time, everyone in the laptop market was trying to keep up with Steve Jobs and Jony Ive. For years, Apple's MacBook Air was the best laptop, combining a heady mix of portability and power. But in 2015, Dell emerged from the crowded pack of MacBook clones and never looked back. It was lightweight. It was powerful. And it was nearly bezel-free. It was the Dell XPS 13, and the start of a new age.

The XPS brand had been around since 1994 in many different variations, but the modern version of the XPS 13 we all know and love came on the heels of the failure of the 2010 Dell Adamo XPS. According to Donnie Oliphant, senior director of product marketing for XPS, "that failed brand attempt for us was probably the best thing that could have ever happened for the XPS brand." It set the stage for a total brand relaunch that started with the 2012 XPS 13, then code-named Spider. With a mission of "delivering the absolute best products...within the consumer space," Dell continued to tweak the design with its target firmly set on Apple's consumer base — the premium market.

It was a feat that seemed impossible; Dell essentially had to out-Apple Apple. But at CES 2015, Dell caught lightning in a bottle, unveiling the first Dell XPS 13 with the InfinityEdge display. Formerly code-named Dino, the 2015 XPS 13 was a jaw-dropping beauty that claimed the title of the world's smallest 13-inch notebook. By eliminating the thick bezels, Dell shrank the XPS 13's profile by 23%.

InfinityEdge wasn't developed in a bubble. After leadership issued a challenge to fit a 13-inch platform into an 11-inch frame, the XPS team worked with Sharp to create the innovative panel. It combined Dell's desire for a super-aggressive, close-to-zero bezel with Sharp's energy-saving IGZO technology, which allows the panel to run higher resolutions more efficiently. After the first panel, Dell refined the color gamut, tweaked brightness and improved the viewing angles.

"[T]hat's where InfinityEdge was derived from," Oliphant said. "It basically was no compromise, the most aggressive mechanical packaging we could put together. And then, let's bundle this with a group of other technologies that would really differentiate our product."

However, that new svelte figure and eye-popping display didn't come without a noticeable compromise. The webcam was moved from the top bezel into the bottom, which for a few years was derisively referred to as a "nosecam." Funnily enough, several laptop manufacturers copied Dell's polarizing design choice. Not shying away from criticism, Dell used it to create a 2-millimeter webcam, making it the world's smallest.

"It was a painful few years, but we did get past it," Oliphant said, "and we appreciate the customers who stuck with us during that trying time, even those who didn't appreciate the nose cam."

By learning from its mistakes and never resting on its laurels, Dell has become an undeniable trendsetter. Bezel-free laptops are now so common, the trend has made its way to gaming laptops. Dell has established a dynasty five years strong, and shows no signs of slowing down. — Sherri L. Smith

iPad (2010)

(Image credit: Tom's Guide)

The iPad bookended the 2010s with its original model launching on April 3, 2010, and a trio of iPads arriving in 2019 following an excellent iPad Pro update in 2018. Originally criticized for just being a bigger iPhone, the iPad has thrived (while most Android tablets died) by both diverging from Apple's phone and keeping what makes it great. 

While many Android tablets are priced at cheaper levels, the iPad won by being worth its price. Consumer tech analyst Avi Greengart of Techsponential has a pretty simple reason why the iPad has won: "It has a much better user experience," which you can tell by using it. Unlike Amazon's more affordable Fire tablets, which have always been kind of slow, the iPad, and especially its Safari web browser, has always been snappy.

And just like the iPhone, the iPad's popularity can be tied to the quality of its apps. Over the years, Greengart notes, the iPad's "apps have been in almost all cases, re-designed to take full advantage of the tablet's real estate, whereas most Android apps on tablets are blown up phone apps that don't have the right density of information." Making the most of a big screen also happens on the couch, as the iPad thrives in a space where it's OK that you've got something that's larger than a phone, but still want something more comfortable than a whole laptop.

With regard to their endurance, Greengart told me "Apple should also be commended for supporting its iPads and phones for longer than — I would say rivals, but there aren't many rivals to the iPad at this point. If you just think of Android phones and tablets, Apple updates its software for much longer than the competition, and that leads to consumers using them for longer periods of time."

While Apple would love for people to upgrade their devices frequently, one of my favorite things about iPads in general is how long they last. My parents love their 4th Generation iPad, which came out in 2012, and have been using it for years.

And with the advent of iPadOS in 2019, Apple's tablet has become a bit more future-proof, with better multitasking tricks, mouse support and other power-user features. Apple's continuing to position the iPad as a laptop replacement, and more of these features give the public reason to consider that argument (if they haven't bought in already). — Henry T. Casey

Apple Watch Series 4 (2018)

(Image credit: Tom's Guide)

When Apple released the first Apple Watch in spring 2015, reviews of the smartwatch were mixed. The seamless iPhone integration was useful for notifications, and there were a few neat features, including the ability to scribble quick iMessages. But the device wasn't really a must-have.

"My first reaction was: That’s it?" said Ramon Llamas, an IDC research director who covers wearables. "There were a lot of people, myself included, who thought this is the device that will raise the entire market. The mistake that I made is that I thought it would happen immediately."

With the Apple Watch Series 4 just three years later, Apple hit its smartwatch stride. Not only did Apple offer, for the second time, a cellular option for people who want a watch that can be used on its own without an iPhone, but the Series 4 also baked in a slew of health and fitness features, including two that were cleared by the U.S. Food and Drug Administration for diagnosing irregular heart rhythms. A fall-detection feature could sense a tumble and then alert emergency services.

People were already crediting the Apple Watch Series 3, which offered low and high heart rate alerts, with saving their lives. The Series 4's built-in electrical heart rate sensor, which can diagnose atrial fibrillation when paired with the watch's ECG app, cemented its status as an essential health device. Apple CEO Tim Cook regularly shares stories he's heard, onstage at Apple events and on his Twitter account, from Apple Watch owners about the impact the watch has had on their lives.

"I think it's a rarity within the realm of technology that you can establish some sort of emotional connection to a device," Llamas said. "You take away my smartphone from me and I'm gonna feel kind of like a fish out of water. In the case of the watch, Apple's playing the angle of this as a lifesaver, and connecting the dots as to why."

Though other companies were working on smartwatches long before Apple, the Cupertino company now dominates the market. Apple's wearables division (which also includes AirPods) would be a Fortune 200 company if spun out on its own.

Those other companies are now playing catch-up: Samsung just put an electrocardiogram sensor in its newest smartwatch, the Galaxy Watch Active 2, and Google acquired Fitbit, presumably in a bid to compete with Apple. Cellular connectivity is no longer impossible to come by in a smartwatch. The Series 4 proved that smartwatches could do so much more than deliver notifications to your wrist, or even make phone calls. They can diagnose disease, making them so much more than the simple accessories they once were. — Caitlin McGarry

Instagram (2010)

(Image credit: Tom's Guide)

No app satiates our scrolling compulsions quite like Instagram. When the social media platform debuted in 2010, its signature filters and grid layouts didn't feel exclusive to photographers. With improving smartphone cameras, anyone could share snippets of their life in tiny squares.        

"At the time, Instagram gave us a dedicated channel to express and follow experiences and moments and thoughts of people in a simple but compelling platform," said Brian Solis, an independent digital analyst and best-selling author who studies digital experience, innovation and disruption. "It was a simple visual medium when most social networks were text-based."

Instagram today looks different than it did 10 years ago. A modern white-and-black platform replaced the original blue-and-gray scheme. It has also evolved over time to keep its user base active, while luring new 'grammers with the key features from competing social media networks. 

"Instagram has embraced a fast-follow strategy, meaning it looks at trends from emerging rivals to mimic those capabilities," Solis said. "Like Stories from Snapchat, for example. It prevents people from leaving."

Depending on your niches, Instagram feels personal and special. I'm able to absorb content from cheese plate artists, Labrador owners and calligraphers who speak to my interests. But for some, the social network fosters a complicated culture of chasing likes.

"There's an underbelly of FOMO," Solis said. "Instagram inadvertently creates a sense of comparison economies. Over time, that erodes things like self-esteem and satisfaction."

In the last year, though, Instagram has been experimenting with hiding likes on posts. The move may trouble influencers or those who earn financial gain from their social engagement. But it may be what propels Instagram against fast-rising competitor TikTok as we begin the next decade.

"All eyes are on Instagram to emulate the TikTok experience to keep people from losing interest," Solis said. "Likes on TikTok are starting to erode the positive experience. In an accelerated period of time the optimism of the platform has faded." 

Instagram has managed to maintain its relevance for 10 years now, and there's no reason to suspect it's going anywhere anytime soon. — Kate Kozuch

AirPods (2016)

(Image credit: Tom's Guide)

Admit it: When you first saw the Apple AirPods, you giggled a little. And who could blame you? Apple's grand entry into wireless earbuds looked like EarPods, but with the wires lopped off. The jokes were flying. But here I am writing about why the AirPods are one of the top products of the decade.

In the tech world, it's been said that a product category hasn't arrived until Apple takes a crack at it (see smartphones, tablets, smartwatches). And so it went for the truly wireless earbuds space. At the onset of the category, I had reviewed my fair share of strong products and vaporware, but when the first generation AirPods debuted, the market seemed to solidify, inspiring other heavy hitters to enter the industry.

Tim Bajarin, a Creative Strategies analyst, said that Apple's entry into the space was all but inevitable. "It goes back to Steve [Jobs]'s incredible love for music and the desire to have that music heard in the greatest quality possible."

But what is it about the AirPods that makes them so special? Apple's particular brand of je ne sais quoi lies in its simplicity. Pairing the AirPods with the iPhone was so simple it seemed like magic, thanks to Apple's revolutionary W1 chip. Plus, it didn't hurt that the AirPods actually delivered surprisingly good audio quality and 5 hours of battery life (24 hours with the charging case). It made the weird design less of an issue.

"[O]ne of the things that we've understood in the tech sector for a long time," Bajarin said, "is that people will buy based on need as opposed to style the majority of the time...Apple has always designed for what they believe the customer needs and what the customer wants." 

Apple continued to fine-tune the formula with the 2nd Gen AirPods, maintaining the design, but doubling down on the functionality. The new, improved AirPods were outfitted with Apple's new H1 chip, making switching between devices even faster while letting wearers simply say "Hey, Siri" to chat with Apple's assistant.

But something was missing, something consumers had been asking for since the first AirPods debuted: noise cancelling. Apple finally gave people what they wanted late this year with the AirPods Pro. Does it shut out all noise? No, but it's a massive improvement. In addition to ANC and Transparency modes, Apple finally added eartips to the mix to give listeners a secure, comfy fit. There's even a fit test to make sure you're wearing the right set of tips. Apple was even kind enough to change up the design and make the buds a little less conspicuous with shorter stems.

So what's next for the AirPods? Bajarin believes the ubiquitous buds will be the company's first foray into augmented reality as a complementary component to a pair of glasses. He speculates that the AirPods will evolve into some sort of lightweight bone-conduction earphones that will allow for a more immersive experience.

"[T]hat's why I believe that if you look at it, I actually think that part of the rationale behind building the AirPods in the first place was with a long-range vision that this would be connected via critical hardware component of their long-term AR strategy."

Wherever Apple takes the AirPods in the future, one thing is certain: The tiny white earbuds have made an indelible mark on the wearables landscape. — Sherri L. Smith

Nintendo Switch (2017)

(Image credit: Tom's Guide)

Nintendo didn’t exactly own the gaming conversation for most of the 2010s. The popular, family-friendly Wii was on its way out, and while 2011's Nintendo 3DS handheld thrived throughout the decade, Nintendo needed a home console contender. That ended up being 2012's Wii U: a confusing, gimmicky console that featured a giant tablet as its primary controller, and one that was already behind the looming PS4 and Xbox One in power.

It felt as if Nintendo's next generation of hardware would be a make-or-break one for the company. And when the Nintendo Switch arrived in March 2017, it was overwhelmingly the former. 

The concept is simple: a small tablet that can be used as a TV console when connected to a dock, or taken on the go as a handheld or tabletop machine thanks to a set of detachable Joy-Con controllers. You can binge on The Legend of Zelda: Breath of the Wild on your couch, undock the Switch, and pick up right where you left off on the subway. You can also stand the Switch up, slide off the Joy-Cons, and enjoy instant multiplayer in titles like Mario Kart 8 Deluxe.

However, even the most brilliant gaming hardware is useless without great games, and despite a relatively slow start, the Switch eventually amassed one of the most impressive libraries of all time.

"One of the most important factors when launching the Nintendo Switch system was making sure we could ensure a strong lineup of launch software and a steady pace of software," said Charlie Scibetta, senior director of corporate communications at Nintendo, when asked about what lessons the company learned from the Wii U era. "We made various efforts with both our internal software development and with third-party partners to make this happen."

And make it happen Nintendo did. Tentpole titles like Breath of the Wild, Super Mario Odyssey and Super Smash Bros. Ultimate were eventually joined by seemingly impossible ports of Skyrim, Dark Souls and The Witcher 3. The Switch quickly became the de facto portable indie game machine, thanks to a constant drip of beloved smaller games such as Celeste and Untitled Goose Game. Heck, even rival console maker Microsoft is supporting the Switch with such titles as Minecraft, Cuphead, and Ori and the Blind Forest.

With a brilliant concept and a steady flow of great games, the Switch was able to surpass the Wii U's lifetime sales after just a year on shelves, and the system has sold more than 40 million units at the time of this writing. The Switch platform has also evolved, with Nintendo launching a dedicated handheld version of the console in September 2019, dubbed the Switch Lite, to capture the 3DS crowd. And with titles such as Animal Crossing: New Horizons, Breath of the Wild 2 and Metroid Prime 4 in the pipeline, it’s showing no signs of slowing down.

Nintendo might have begun the 2010s stumbling, but thanks to the Switch, it's starting the 2020s on top. — Mike Andronico

Apple iPhone 4 (2010)

(Image credit: Tom's Guide)

The list of firsts attached to the iPhone 4's name is almost too long to recount.

This was the first iPhone with a front-facing camera and FaceTime for video calls; the first with an A-series processor designed by Cupertino in-house; the first with a high-resolution Retina Display; and the first smartphone with a glass-and-aluminum sandwich design. The rear camera was accompanied by a flash for the first time as well, and iOS 4, the operating system that the iPhone 4 launched with, was the first version to allow app multitasking.

Of course, smartphones existed before the iPhone 4. However, this was an inflection point. Modern phones owe more to the iPhone 4 than to any other device. Everything we expect and associate with the mobile experience — the razor-sharp displays, the design-forward attitudes, the evolution of the phone as a symbol of luxury and craftsmanship, not to mention the very idea of a smartphone as a camera as much as a communications device — we owe it all to the iPhone 4.

"The iPhone 4 is most significant to me as being Apple's first 'camera,'" Rene Ritchie, Apple analyst and iMore editor-in-chief, said. "Previous iPhones had cameras. The iPhone 3GS famously added video. But the iPhone 4 was the first in what began a long line of really camera-centric keynote presentations for new iPhones.

"Looking back, I think we'll be able to trace a direct line from the iPhone 4's emphasis on cameras to the iPhone 7 and now iPhone 11 in being really almost camera-first. And among the best in the industry," Ritchie said.

The iPhone 4 isn't just remembered fondly for what it could do; it's relevant for what it stood for. Design was always a pillar of Apple's product philosophy, but the iPhone 4's workmanship was simply on another level. Where previous iPhones predominantly used plastic, the iPhone 4 bonded two slabs of strengthened glass with an aluminum frame for the first time. It looked unlike anything else back in 2010 — it was beautiful.

"The very Leica- and Braun-esque design blew me away when I first held it," Ritchie recalled. But, of course, the iPhone 4's unique construction also proved to be its most crippling flaw, because the breaks in the aluminum band necessary for the phone's antennas could be easily bridged by the user's hand, attenuating the signal.

"Antennagate is, for me, the example of the quintessential Steve Jobs' Apple," said Carolina Milanesi, Creative Strategies analyst. "Only Jobs could have pulled off 'You are holding it the wrong way.' I always point to that when I look at how different Apple is now under Tim Cook, where the admission of missteps are a little more forthcoming."

Ritchie agreed, calling to mind the infamous Gizmodo prototype leak that preceded Apple's launch by two months as another example of the company's growing pains on its way to being the preeminent player in the mobile industry.

"I think both the left-it-in-a-bar iPhone 4 and antennagate opened Apple's eyes to the realities of being both a consumer and not just computer tech company, but also a mobile device manufacturer," Ritchie said. "They weren't handled gracefully at all, but they forced Apple to grow and understand the risks and the responsibilities that would come with their new business opportunities." — Adam Ismail

Amazon Echo Dot (2016)

(Image credit: Tom's Guide)

Since Amazon unveiled the Echo Dot in 2016, the small smart speaker has disrupted homes well beyond making parents hesitate to name their children Alexa. Priced at a comfortable $49.99, the stationary hockey puck helped propel voice assistants into the mainstream and changed how we interact with technology forever.

"The price point was pivotal to the general adoption of voice assistants in the home," said Eric Turkington, vice president of strategic partnerships for AI-voice solutions consultancy Rain. "The Echo Dot price was a quarter of the first Echo speaker. It presented a lower barrier to entry so many people could buy them."

With the Echo Dot, Amazon introduced a new experience: a low-cost, hands-free, home-based device that connects users to thousands of skills and services. Alexa's superpowers span audio providers, interactive games and organizational assistance. You can ask for traffic conditions, get a daily news briefing, create shopping lists and more.

The Echo Dot's influence has given rise to a number of smart-home developments, too. Connected gadgets like smart lights, smart locks and smart plugs make more sense to manufacture, sell and buy when it's expected you’ll have a smart speaker to control them.

There are now 35 million smart homes in the U.S.: residences with networked devices and related services that enable home automation through voice, app or third-party hub control. In 2016, when the Echo Dot debuted, there were just 15 million reported smart homes.

"The Echo Dot has been a transformative consumer product for voice assistants and smart homes," Turkington said. "People are buying them as a way to blanket their homes in voice assistant accessibility."

In 2019, Amazon, along with Google and Apple, faced backlash over a lack of transparency in its privacy policies when it was discovered that humans review voice recordings for quality feedback. The companies have since doubled down on privacy measures by expanding user data controls.

"Privacy has been a significant concern in the voice space, but hasn't necessarily dampened the tide of device sales," Turkington said. "We’re still seeing voice-enabled products among the top consumer sales."

Amazon's Echo offerings span sizes, prices and practical uses, but none hits the same sweet spot as the Dot. It provides smarts and solid sound in a compact, affordable package. And it only gets better with age: We gave the third-generation Echo Dot a perfect rating in our review. — Kate Kozuch