If you haven’t noticed by now, we’re going through an AI revolution of sorts. Phones like the Pixel 8 and Pixel 8 Pro have certainly made AI-assisted features a compelling part of their story, but the Samsung Galaxy S24 Ultra could redefine how deeply artificial intelligence is entwined with our smartphones when the new phone arrives next year. I’m stoked about the possibilities.
Let’s face it, phones are doing a lot more than the mundane stuff we’ve been using them for. Just look at how Google’s latest flagships leverage Google Assistant to take phone calls when you’re unavailable, then give your callers contextual responses. This is exactly the direction AI needs to go to prove its contextual awareness, and it's presumably only going to get better with the Galaxy S24 Ultra.
Samsung’s already laying the groundwork for its next flagship phone, evident in a recent blog post from the company indicating how Galaxy AI will be a comprehensive mobile AI experience that leverages both on-device and cloud-based AI. There’s also Samsung Gauss, the company’s generative AI model that promises greater creative freedom when editing photos and video.
This all sounds ambitious, and it could make the Galaxy S24 Ultra the ultimate AI-powered phone I’ve been waiting for. I’ll break down all the AI-assisted features I’d like to see, along with the likelihood of them actually happening.
Improving on Pixel 8’s Best Take feature
One of the AI-assisted features on the Pixel 8 that I've found oddly useful is Best Take, the post-editing feature that lets me choose the faces of the people in my photos, resulting in picture-perfect shots every time. The problem is that this feature only works when you shoot multiple photos of the same scene, giving the Pixel a bigger pool of faces to swap in.
The Galaxy S24 Ultra could use AI to improve this feature so I’m not constantly forced to snap multiple sets of photos. Just one and done, while still providing me with several other face options to choose from.
Outlook: Very possible
Generative AI for editing on the go
Again, Google’s Pixel phones have some of the best native photo editing features I’ve seen on a phone. Magic Editor lets me remove unwanted objects and subjects in my photos, while its sorcery lets me move stuff with ease. That's generative AI at work, filling in the gaps in a photo to produce a result so realistic it's hard to tell it was edited at all.
Samsung Gauss could be a big leap in how generative AI transforms photo editing as a whole. I suspect a combination of on-device and cloud-based AI would be needed to let me type out the changes I want in a photo — like swapping out the sneakers on someone’s feet, or changing the background to something entirely different.
Adobe Firefly, the machine learning model behind Photoshop’s generative fill feature, already does this. Of course, Photoshop on a PC is a far more comprehensive photo editing tool than anything on mobile, but Samsung could at least offer some similar features to start off.
Outlook: A lot of work, but possible
Actually making phone calls for me
I absolutely love the Pixel 8’s Call Screen feature. It’s probably the reason I’ll never pick up phone calls again, because the Pixel handles that job so well. The Samsung Galaxy S24 Ultra could take the next logical leap: making phone calls for me.
It’s hard to say whether Bixby would be the voice behind these phone calls, but either way, it would take a lot of processing power for the Galaxy S24 Ultra to pull off. I presume a basic set of commands could be available to initiate calls, such as making dinner reservations or calling someone because you’re running late. However, I imagine AI’s real worth lies in how it could provide contextual responses during a conversation with the person on the other end of the line.
Outlook: Maybe not
A real personal assistant that can take action
Right now, as I look down at my phone, I see that I have a meeting coming up in about 10 minutes. If I hadn’t glanced down for that split second, I would’ve totally missed it. This is a perfect example of how a phone could act like a real personal assistant — by keeping me informed and up to date on my schedule.
Perhaps the Galaxy S24 Ultra could recognize when an upcoming meeting notification hasn’t been seen and start buzzing to get my attention. Or if I'm running late for a meeting because I’m stuck in traffic, AI on board the Galaxy S24 Ultra could suggest sending an email or text message to the people I’m meeting, all based on my phone’s GPS coordinates and the current time.
I’m not asking for Samantha in Spike Jonze’s 2013 film Her, but at least give me an assistant who can intelligently be aware of my daily routine and schedule.
Outlook: Very possible
Storytelling with my memories
I’m looking forward to trying out the Journal app on my iPhone, which should arrive with the public release of iOS 17.2. Apple’s leveraging AI to curate suggestions based on the music you listen to, the places you’ve visited, and the photos you’ve captured, so I think the Galaxy S24 Ultra could offer something similar.
I’m all for the Galaxy S24 Ultra creating more dynamic memories with the help of video, but I think it could go the extra step by also analyzing the conversations in the videos I capture and extracting what’s being said — and who said it. AI could automatically analyze the audio and footage, then suggest short clips that highlight some of my recent memories.
Outlook: Very plausible