News of the Week (September 25 - 29)

Hey guys. I’ve been pretty sick with Covid all week. If there are a few extra typos in here, I apologize in advance. I’m quite out of it.

1. Meta Platforms (META) — Meta Connect 2023

Founder/CEO Mark Zuckerberg and his team took the stage to share some exciting developments within Quest and AI. Here, I’ll condense that hours-long event into a few-minute read covering the highlights you need to know.

On Quest 3’s Approach vs. Apple:

Meta announced the Quest 3 mixed reality headset this week. It’s 40% thinner than Quest 2 and will start at $499 -- less than 15% of the cost of Apple’s Vision Pro headset. That’s an important point. The two companies are approaching this hardware disruption in opposite ways. Meta is selling at cost to drive access and ubiquity. It’s a classic case of commoditizing one piece of a value chain to ensure dominant market share… then leveraging that share for profits later and elsewhere. Apple will sell the hardware with its typical gaudy gross margins and lean on its sticky ecosystem to drive adoption. Both approaches should work. There’s likely a future duopoly forming.

“It’s not just our job to make these things possible. It’s our job to make them affordable and accessible.” -- Founder/CEO Mark Zuckerberg

On What’s New for Quest 3:

Quest 3 will be the first “mainstream mixed reality headset” to market. When putting the device on, you’ll see your physical space. From there, you can augment that environment with digital objects suspended in your actual room. Whether it’s watching sports on a giant virtual screen (or sitting courtside through X stadium), playing games on interactive holograms, or practicing surgery in a risk-free environment, real use cases are abundant.

The headset comes with 10x more pixels than its Quest 2 predecessor to sharpen graphics and reduce latency. Beyond more pixels, the next-gen Snapdragon chip from Qualcomm and a new pancake-style lens enhance graphics further. That enhancement paves the way for a new feature: the mixed reality hardware can map your actual space to better merge physical and digital worlds. Zuck offered examples of what this will mean:

  • Throwing a digital ball at a physical wall and having it bounce off that wall as a real ball would. Not interesting.
  • Using your actual couch in a room to take cover while playing a social shooting game. More interesting.

Generally speaking, better graphics mean better gaming quality. This is important considering gaming is currently the most popular use case for these headsets, and that will likely remain true for a while. While Apple will surely add more content over time, Meta runs laps around it in terms of the size and quality of its gaming library. That library just got better:

  • Microsoft’s Xbox will integrate with Quest 3 for the first time. This will bring hundreds of iconic Xbox gaming titles to this hardware form factor.
  • As constantly rumored, Roblox’s gaming suite will be optimized for Quest 3 and integrated as well.

Still, while gaming is king of headsets for now, Meta is hard at work building enterprise use cases. As previously announced, it will integrate Microsoft 365’s productivity apps. Much more to come here.

Finally on Quest 3, there was a popular Lex Fridman interview with Zuck this week. It showed off Meta’s hyper-realistic Codec avatars and how the Metaverse will create a sense of remote connection. This was mostly review/expected for those of us who have been closely tracking developments.

On Artificial Intelligence & Chat Bots:

While Llama 2, Meta’s open-source large language model (LLM), garnered much hype, Meta has a lot more innovation on the docket. It announced Expressive Media Universe (“EMU”), a newer model that generates images from text in 5 seconds (not minutes). It is building this directly into all of its chat functions (WhatsApp, Messenger, etc.) to upgrade features like stickers. I toyed around with the tool this week and candidly it’s quite limited. That’s to be expected initially, and quality will only improve from here. With EMU, stickers will transform from a static library of options into a generative AI-powered tool that creates whatever image your mind desires.

Beyond EMU, Zuck spoke at length about “Meta AI.” This is a chatbot assistant, built on Llama 2, that fields basic queries. It pulls from a partnership with Microsoft’s Bing Chat (noticing a pattern?) to infuse real-time, current information. That feature eliminates outdated responses.

Meta AI will be open sourced and opened to 3rd party developers in the coming months to build and customize on top of.

For more specific model use cases, Zuck detailed some fun additional models:

  • Max the Chef helps you use your available food to cook up recipes.
  • Lily is its editor AI model.
  • Lorena is its travel concierge model.
  • Victor (played by Dwyane Wade) is its personal trainer model.
  • Snoop Dogg is its role play gaming model.
  • Bru (played by Tom Brady) is its sports trivia model.
  • Kendall Jenner is its big sister advice model.
  • MrBeast is its funny man model.

These models do not have access to real-time data like Meta AI. The company is working to add that feature in the coming months.

To make 3rd party developers’ lives even easier, Meta is building “AI Studio.” This will function as its open-source platform to enhance and customize Meta’s models for more granular use cases. It will provide intuitive processes for building on top of Llama 2 and other Meta LLMs. It’s also building a sandbox to ensure this can be done in a no-code manner. Businesses will be able to integrate these custom models into their own customer service functions to enhance the quality and speed of interactions. This test launched with just a few clients today and will likely take a full year to be rolled out.

Why do Chatbots Matter?

Meta will monetize chatbots through business use cases like automated customer service. That, however, is not the most exciting part about chatbots. Instead, the thing I find most promising is the indirect lift to app engagement. The longer Meta keeps its users entertained on these apps, the more incremental and relevant data it has to use. Why does that matter for an ads-based company like Meta? More data means more algorithm training to uplift its targeting capabilities. This will mean more valuable placements and a larger number of placements thanks to the added time spent. That’s the holy grail for Meta.

This theoretical boost must play out for the heavy costs associated with model training, inference and maintenance to be justified. LLMs are expensive. Meta thinks it will play out (so do I).

On Going Slow with AI Innovation:

Meta is intentionally slow-playing the rollout of these models. Why? It wants to make absolutely sure that it has the proper guardrails, self-regulation and rules in place before giving this to the masses.

On its New Ray-Ban Smart Glasses:

Zuck revealed the next generation of Meta’s smart glasses. The new Quest headset will likely approximate this look over time. And that’s necessary if broad adoption is to be realized. For now, the glasses offer bare-bones features in a comfortable form factor; the headsets do a lot more but candidly are not comfortable for extended wear.

The new glasses come with better cameras, better audio and a lighter frame. They also allow for live streaming the field of view to followers. Zuck walked us through an example of Formula 1 driver Charles Leclerc taking us on a race with him for an idea of how this could be relevant.

The new glasses will also be fully equipped with Meta AI which facilitates some cool use cases. More use cases will come next year after a planned free software update to make the glasses multi-modal. Use cases described by Zuck include:

  • Replaying a pickleball shot to see if it was in or out. That could have solved some major fights with my siblings growing up on the tennis court.
  • Asking your glasses how much more time that steak on your barbecue needs to cook.
  • Having the glasses translate a sign and tell you what building you’re looking at. This would’ve made school field trips a lot more fun.

As Zuck explained, the best way to train AI models is to show them exactly what a person sees and hears. The Meta AI integration means this training can be taken to a new level within the comfortable, Ray-Ban hardware. These will launch next month starting at $299.

Conclusion:

The near-term value of these AI models is clear. Outside of AI, none of this will be material to Meta’s financials in the near term. Still, I find all of this to be exciting as a long-term shareholder. Meta is innovating where needed to ensure it’s a primary piece of the future of social media and the next computing form factor. It’s laying the groundwork to ensure that it continues to be the disruptor rather than the disrupted. Fun event.

2. Amazon (AMZN) – Anthropic & More

a. Anthropic

Amazon announced a $1.25 billion equity investment in Anthropic, with the option to invest another $2.75 billion; Google is a smaller investor here as well. Anthropic is a generative AI research firm and model builder founded by OpenAI alumni. And speaking of OpenAI, the two direct competitors are often compared and are broadly seen as the cream of the crop in this niche. Microsoft invested $10 billion in and partnered with OpenAI earlier in the year, and this is Amazon’s response.

As part of the arrangement, AWS will become Anthropic’s primary cloud provider. It’s not the exclusive provider like Microsoft Azure is for OpenAI. Amazon’s Bedrock (and AWS customers) will get early access to Anthropic’s new models like Claude 2 while the two work closely to build more models. Bedrock is Amazon’s managed foundation model service that 1st and 3rd party developers build on top of; Anthropic will make its own foundation models available there for further customization.

Anthropic will use Amazon’s Trainium and Inferentia chipsets as part of the deal. As the names imply, these are Amazon’s chipsets internally built to train and sharpen model inference more cheaply. As an aside, that’s where I see AWS (and possibly Google) potentially standing out from Azure if these chipsets prove successful. They allow Amazon to enjoy more efficiency-fostering vertical integration with more revenue generation opportunity.

Like OpenAI and Microsoft’s pairing, this is a match made in heaven. Very few companies have the infrastructure to rationally host generative AI models at scale – doing so is wildly expensive. Amazon (like Microsoft/Google/Meta) is one of those companies. This merges that strength with the world-class models being built within Anthropic. Pairing those models with Bedrock and the rest of Amazon will surely create broader, higher quality use cases, similar to what Microsoft did with OpenAI and Bing. Now Amazon can fixate on building the best infrastructure and quality chipsets while using Anthropic (and Hugging Face through a separate investment) to partially alleviate allocating finite resources to model building.

I’ve heard leadership interviews indicating that Amazon wanted to do ALL of this internally. There’s no reason to do so. Just building out a few pieces internally will still leave Amazon with a gigantic addressable market. And it will speed the pursuit of that addressable market with a higher probability of success. There was no reason to make this harder than it needed to be, so Amazon welcomed Anthropic with open arms. AWS customers are demanding rapid product enhancements, and this will materially speed its innovation cycle. Tighter focus… better models… happier customers… good decision.

b. More

Amazon will infuse ads into Amazon Prime Video next year. This is yet another high margin lever for the company to pull.

The FTC and 17 Attorneys General sued Amazon over monopolistic claims surrounding Prime subscription practices. It accused the giant of using its market power to strike unfair deals with subscription partners. To be candid, this is irrelevant to me. It will probably result in nothing material. In the VERY slim chance Amazon is forced to break up, the sum of the parts is likely worth more than the whole. That would simply mean a nice payday for shareholders and me moving on to other opportunities. That’s an easy worst-case scenario to stomach. I added this week following the volatility based on this headline.

Amazon made Bedrock fully available to developers while adding Meta’s Llama 2 model. AWS will be Meta’s application programming interface (API) partner for this open-source model, and Bedrock is the first cloud-hosted foundation model service to offer Llama 2. Amazon also debuted its generative business intelligence tool to turn conversational language into graphs and code (with the help of CodeWhisperer).

3. Uber (UBER) – Europe & More

The News:

The European Commission (EC) and Parliament are now negotiating gig worker protection legislation passed by EU member states this summer. The legislation is called the “Platform Work Directive” and is something for investors to watch.