Instagram’s subtle tweak, and Samsung’s big Android tab has inarguable utility
Tech giant Intel announced a $10 billion cost savings plan for 2025 a few weeks ago. That meant a headcount reduction of 15,000 from its global workforce (which numbers around 1,25,000), as well as reduced research spending through 2026. Not sure if the boffins within Intel didn’t have enough AI models to predict what’d happen next, but it’s happened exactly as any smart human would have predicted (or advised). Apparently, employee morale (even among those left behind after the latest culling) is really low. The cost-cutting plan to abolish free coffee and tea for staff has now been revoked. A snapshot of the times we live in. A few days ago, I’d written about the shambolic state of some tech companies, and for anyone hoping that 2024 would put the brakes on a trend of the past two years, layoffs are simply surging again.
Intel isn’t the only chipmaker still trying to sort out its finances. Qualcomm is in a similar situation too. There are many, many others in this boat. Dropbox’s cloud storage business is facing significant competition from the AI-infused offerings of Google, Microsoft and Box, but while I don’t claim to know better, I haven’t seen Dropbox try to counter with subscription price reductions that would make its plans more enticing for users who may otherwise leave entirely. New-age business logic belies logic.
Even as humans struggle to tame the balance sheet, there’s the spectre of artificial intelligence’s growing capability to replace human workers. Social media platform TikTok plans to let go of approximately 500 employees, primarily in content moderation roles that will be handled by artificial intelligence. IBM has already detailed intentions to replace 8,000 employees with AI in the coming years.
I signed off by saying tech and auto companies, amidst letting employees go, must be hoping that the core of their businesses, including technology, product and diversification plans, is on solid ground. If that is not the case, 2025 could bring a lot more pain.
FOOTPRINT
This is as unique as Android tablets have ever been. Again, it is Samsung spearheading attempts to try something different, to make Android as a platform more palatable to different workflows. By making a large, unusual 14.6-inch screen work as a tablet, Samsung has done exactly that, making Android relevant for creators and office usage. This screen size also makes it a direct laptop alternative, a mid-point between the 13-inch and 15.6-inch screens common on laptops. Some may call the Samsung Galaxy Tab S10 Ultra too big. I am not one of those people.
This is true to its “ultra” moniker. A pity, then, that the keyboard accessory was still not available when I was testing this mammoth tablet. That said, what Samsung has done is incredibly likable. I spent quite some time with Microsoft Word documents, creations on Canva and edits in Adobe Lightroom, and this is an experience like no other Android tablet thus far. It also makes a case for more desktop-esque apps on Android, when a screen size such as this (and the horsepower that’s available) demands it.
There will be inevitable apprehension about Samsung’s choice of the MediaTek Dimensity 9300+ as the chip to power this flagship, given the comfort of a Qualcomm Snapdragon chip, born of familiarity (and the consistency we have come to expect). It is therefore pleasantly surprising that this chip never stutters or stumbles, or for that matter, heats up. Beyond that, long battery run times, enough memory and storage headroom, and a thickening AI envelope add value.
KNOW
- How annoying was it when you opened Instagram, saw a video or a post, began to interact with it, and then Instagram decided to refresh the feed without so much as a beg-your-pardon? That wasn’t exactly a bug, but Instagram’s method of loading new content in the background. Instagram boss Adam Mosseri now confirms that this was “really annoying” and “so we stopped doing it.” That’s positive.
- It had to happen at some point, and now is that time. Having so far let iPhone users find the Gemini artificial intelligence chatbot within the existing Google Search app, Google is close to rolling out a standalone Gemini app. It makes sense, if Google wants iPhone users to actively adopt not just the Gemini assistant, but also conversations with the comparatively more situationally aware and conversational Gemini Live.
INTELLIGENCE
This summer, Meta’s artificial intelligence infusion arrived in earnest (read our analysis): a generative AI and chatbot layer for WhatsApp, Instagram and its other social media apps. Meta had to, because it was clear Google’s plans with Gemini were leveraging Android’s scale to the fullest, and Microsoft, with its OpenAI trump card, wasn’t far behind. In the initial phases, Meta AI came across as a fairly basic generative tool, one that would regale WhatsApp users for hours on end (and it did too). But little did many realise, Meta was simply not showing the Llama model’s full capabilities. Formula 1 fans would find some similarities with the “sandbagging” allegations most teams face in pre-season testing, when they are accused of not showing the car’s true performance to rivals.
What Meta has managed to do is develop Llama well enough that it is ready for use by governments. That journey has started with the US government, in sync with the private companies that work on federal contracts (some names include Accenture Federal Services, Amazon Web Services, IBM, Lockheed Martin, Microsoft, Oracle, and Palantir). Llama is ready for defence and national security applications, they say.
At this point, I’d like to point you to Meta’s Llama 3 Acceptable Use Policy, in which Section 2, Clause ‘a’ states that it mustn’t be used for “Military, warfare, nuclear industries or applications, espionage, use for materials or activities that are subject to the International Traffic Arms Regulations (ITAR) maintained by the United States Department of State.” To this, Meta clarified that the use of Llama 3 is very much on the agenda to “streamline complicated logistics and planning, track terrorist financing or strengthen our cyber defences.”
Meta has bigger plans for Meta AI. First is broader adoption across the public sector in the US. I am sure there are global aspirations, which will become clear soon enough. And then there’s the training for Llama 4. In use is a cluster of GPUs (or graphics processing units, computing hardware) that is “bigger than anything” used for any models till now. Apparently, this has more than 100,000 of Nvidia’s H100 Tensor Core GPUs in play (each of these in-demand GPUs costs around $25,000), which makes this cluster around four times the size of the 25,000-strong H100 cluster used to develop Llama 3.
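For a sense of scale, here is a rough back-of-envelope sketch based on the figures reported above (the GPU count and per-unit price are approximations from that reporting, not Meta’s disclosed spend):

```python
# Back-of-envelope estimate using the reported figures (assumptions, not Meta's disclosed numbers).
llama3_gpus = 25_000          # H100 GPUs reportedly used to train Llama 3
llama4_gpus = 100_000         # "more than 100,000" H100s reported for the Llama 4 cluster
h100_unit_cost_usd = 25_000   # approximate per-GPU price cited above

scale_up = llama4_gpus / llama3_gpus              # roughly 4x the Llama 3 cluster
gpu_spend_usd = llama4_gpus * h100_unit_cost_usd  # roughly $2.5 billion in GPUs alone

print(f"Cluster scale-up: ~{scale_up:.0f}x")
print(f"Approximate GPU hardware outlay: ~${gpu_spend_usd / 1e9:.1f} billion")
```

Even as a rough estimate, that is a hardware outlay in the region of $2.5 billion for the GPUs alone, before counting networking, power and cooling.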
Some thought-provoking Tech Tonic from previous weeks…
QUESTION
Regular readers would remember our extensive coverage of Apple’s recent Mac refresh with the M4 chip. In this cycle, the MacBook Pro, Mac mini and the colourful iMac got M4 chips, with the MacBook Pro helping introduce the M4 Pro and M4 Max. Where does that leave the MacBook Air, Mac Studio and the Mac Pro? There isn’t much to worry about if you intend to buy any of these three computing devices sometime next year; just wait through the first quarter of 2025. My guess is that, for the first time in a really long time, Apple will have the entire Mac portfolio on the same family of chips (and it’d be a first for Apple Silicon). The indications point that way, particularly with a ‘Mac week’ approach instead of a single keynote or a simpler set of press releases.