Welcome to episode 228 of the Cloud Pod podcast – where the forecast is always cloudy! This week your hosts Justin, Jonathan, Matthew, and Ryan are taking a look at the Magic Quadrant, Gemini AI, and GraalOS – along with all the latest news from OCI, Google, AWS, and Azure.
Titles we almost went with this week:
- The CloudPod wonders if Anthropic’s Santa Clause will bring us everything we want in an AI Bot.
- The Cloud Pod recommends protection to achieve Safer
- Google rides the Gemini rocket to AI JPB
- The only Copilot I need Azure, is Booze
- GraalOS, or what we now call ‘the noise our CFO makes when he receives the Oracle audit bills’
- The hosts of the Cloud Pod would like to understand how to properly pronounce GraalOS
- Is Oracle even on the magic quadrant for cloud?
- RedHat Puts lipstick on the pig and calls it OpenStack
A big thanks to this week’s sponsor:
Foghorn Consulting provides top-notch cloud and DevOps engineers to the world’s most innovative companies. Initiatives stalled because you have trouble hiring? Foghorn can be burning down your DevOps and Cloud backlogs as soon as next week.
📰General News this Week:📰
- “By integrating Kubernetes with OpenStack, organizations see improved resource management and scalability, greater flexibility across the hybrid cloud, simplified development and DevOps practices and more,” said Sean Cohen, director of product management in Red Hat’s Hybrid Platforms organization.
- Per Holger Mueller, OpenStack has gained a lot of popularity in the telecommunications industry, where it’s used to build private clouds to run their networks… *adds telecommunications companies to the list of places not to work*
02:32 📢 Justin – “I mean, OpenShift is just like Convox. It’s a platform on top of Kubernetes and a fancy developer portal. And so then you get, now you add to that OpenStack.”
- Amazon is investing up to $4 billion in Anthropic. The agreement is part of a collaboration to develop the most reliable and high-performing foundation models in the industry.
- As part of the agreement, AWS will become Anthropic’s primary cloud provider for mission-critical workloads, providing Anthropic with access to leading compute infrastructure in the form of AWS Trainium and Inferentia chips, which will be used in addition to existing solutions for model training and deployment. Together, they’ll collaborate on future Trainium and Inferentia technology.
- Based on AWS customer demand for Claude, Anthropic will expand their support for Amazon Bedrock.
- Amazon and Anthropic are committed to the safe training and deployment of advanced foundation models.
- Amazon will take a minority stake in Anthropic. Yes. A $4 billion dollar investment gets you a minority stake. Inflation, amirite?
- Amazon has found their OpenAI?
04:58 📢 Jonathan – “It sort of begs the question of if Microsoft hadn’t partnered with OpenAI, would Amazon have partnered with them first? Or is this a reaction to the Microsoft OpenAI deal, or is this what they actually wanted and kind of planned all along? I don’t know. I do like what they’re building though. Claude is totally different than ChatGPT in the way it’s trained and the way it works, and it solves a lot of the problems that ChatGPT has right now.”
Listener Poll: Which LLM do you think Oracle is gonna buy? Let us know what you think!
- EKS now supports Kubernetes 1.28, and this is our time to talk about Kubernetes since Amazon is doing it. Never mind that everyone else has been supporting it for a month.
- New things in K8s 1.28 you can get in the clouds:
- K8s 1.28 introduces a more lenient version compatibility policy for its core components, which expands the supported skew between the K8s API server and the kubelet.
- Stateful workload enhancements are now stable
- Justin is threatening once again to run SQL Server on EKS. Someone warn Cody.
- Advanced topology management and fine-tuned pod placement has reached beta for those who want to micromanage pod placement.
- P2 Instance deprecation on AWS. Go find the new P3 instances.
10:16 📢 Jonathan – “No one’s added AI to Kubernetes yet. Maybe they, I mean, other than, I guess, GitHub Copilot and all the other coder helpers can now write Kubernetes scaffolding, I guess. But yeah, can someone write an AI that’ll manage Kubernetes or is that just a bridge too far for AI…Kubernetes might be why the AI would actually want to get smart enough to kill us all.”
- US Regulators and 17 states are suing Amazon over allegations the e-commerce behemoth abuses its position in the marketplace to inflate prices on and off its platform, overcharge sellers and stifle competition.
- This is the result of a year-long investigation by the FTC.
- The suit alleges that the company is anti-competitive through measures that deter sellers from offering lower prices for products on non-Amazon sites.
- While this is focused primarily on the E-commerce side, the reason they can undercut their partners and competitors is driven by the massive profits of AWS.
- This could result in an outcome that forces the breakup of Amazon.
- To win this case, the FTC has to prove that Amazon is a monopoly in specific markets (the online superstore market and the online marketplace services market)…
- …and that Amazon has used that monopoly to harm consumers and competitors, by allegedly employing exclusionary anti-discounting conduct that artificially boosts prices, and rules that “coerce” sellers into using its fulfillment services.
- Google is reportedly getting into the LLM game, and plans to release the Gemini AI model to compete with GPT-4.
- Gemini comprises a set of large-language models, which can power everything from chatbots to features that summarize text or generate original text – such as email drafts, song lyrics, or news articles – based on descriptions of what users want to read.
- The stakes are high for this model to be competitive with the Open AI GPT models.
- The model will end up in everything AI at Google, including Bard, Duet products, and future AI-powered innovations.
- From someone who tested Gemini, it has one big advantage over GPT-4 and that is it leverages Google’s proprietary data from its consumer products in addition to public information on the web. As a result, the model should be especially accurate when it comes to understanding users’ intentions with particular queries, and it appears to generate fewer incorrect answers.
- In addition, Gemini will be available through Google Cloud Vertex AI.
17:03 📢 Jonathan – “Yeah, in comparison, if the rumors are true about Gemini, the size of the model is absolutely enormous compared with anything that OpenAI’s done. I think their ChatGPT GPT-4 model is like 130 billion parameters. And I believe the rumor for the Google Gemini is somewhere between, you know, it’s greater than a trillion parameters. And so there’s a lot of money gone into training that. And if it’s true, then it’s gonna blow everything else out of the water.”
17:57📢 Ryan – “At a certain point, all these providers are going to have to actually try to make the money off of these things instead of trying to build out the datasets by offering it for free. And it’s going to be a very interesting change.”
- Gartner has recognized Google as a Leader in the 2023 Gartner Magic Quadrant for Container Management… at this point, why don’t they just call it the Magic Quadrant for Kubernetes?
- Google is in the top right position, although technically, Microsoft is a little farther right of them.
25:53 📢 Jonathan – “The thing that bugs me about it is it’s not evidence-based. They’re not going off doing their own research. It’s basically they’re polling customers based on their interactions with those products in the different clouds.”
- Microsoft has finally settled on calling everything AI “Copilot” – your everyday AI companion.
- Copilot will uniquely incorporate the context and intelligence of the web, your work data and what you are doing in the moment on your PC to provide better assistance — with your privacy and security at the forefront.
- Copilot will begin rolling out as part of Windows 11 on September 26, as well as across Bing, Edge, and M365 this fall.
- Having set up my new phone, can they use AI to copy my Outlook signature to all my computers and phones so I don’t have to enter it every time?
- Windows 11 users will get AI in Paint, Photos, Snipping Tool, Clipchamp, Notepad, Outlook for Windows, a modernized File Explorer, new voice access for text authoring, new natural voice narrators, and Windows Backup.
28:19 📢 Justin – “Windows 11 users are excited to know that you’ll get AI in Paint, for those of you who still use Paint regularly. You’ll also get it in Photos, Snipping Tool, Clipchamp, Notepad, Outlook for Windows, a modernized File Explorer, new Voice Access for text authoring, and a new Natural Voice Narrator. And Windows Backup will come with AI, which I would never trust. Windows Backup, so that’s cute.”
- Azure is announcing the public preview of HDInsight on AKS, their cloud native, open-source big data service, completely re-architected on AKS with two new workloads and numerous improvements across the stack.
- HDInsight on AKS includes Apache Spark, Flink, and Trino on Azure Kubernetes infrastructure, and features deep integration with popular Azure analytics services like Power BI, Azure Data Factory, and Azure Monitor.
- At Oracle CloudWorld 2023, Oracle announced GraalOS to power their cloud-native runtime technologies, particularly functions. OCI Functions with GraalOS can enable serverless functions to launch in seconds and use up to 50% less memory for most workloads compared to traditional functions.
- The faster a function starts up, the less need there is to provision concurrency.
- Initially, this is available for Java only, with other languages getting the support later (are you just copying AWS here?)
- GraalOS is a faster and more efficient cloud runtime that uses the latest processor architectures to deliver higher performance with fewer resources. Its Native Image ahead-of-time compilation technology builds an application into a standalone native machine executable that includes only the code required at runtime.
- It excludes unused classes, methods, and files from the executable.
- Benefits include:
- Ultra-fast cold starts
- Less memory required
- Out-of-the-box integration with cloud services.
36:25 📢 Justin – “I think AWS also had something similar to this where they, they did a faster cold start problem and it was like, it only works with Java initially. I haven’t seen them extend that to beyond Java. I don’t think, and I don’t think Oracle ever extended this either.”
36:51📢 Matthew – “It amazes me. It just amazes me how much is still written in Java.”
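For a bit of context on how that ahead-of-time story works: GraalOS builds on GraalVM Native Image, which compiles a Java program into a standalone executable at build time instead of interpreting bytecode on a JVM at startup. A minimal sketch (the class and method names here are just illustrative, not from Oracle's announcement):

```java
// HelloFunction.java — a minimal Java program suitable for GraalVM
// Native Image ahead-of-time compilation.
public class HelloFunction {
    // Only code reachable from the entry point ends up in the native
    // executable; unused classes, methods, and files are excluded.
    static String greet() {
        return "Hello from a native image";
    }

    public static void main(String[] args) {
        System.out.println(greet());
    }
}
```

With GraalVM installed, `javac HelloFunction.java` followed by `native-image HelloFunction` produces a standalone native executable that starts in milliseconds with no JVM warm-up – which is the property that makes the fast-cold-start claim plausible for serverless functions.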
- Oracle missed expectations and provided lower revenue guidance, resulting in their stock sliding.
- Income for the first quarter was $2.42 billion, up from $1.55 billion a year earlier. Earnings before costs such as stock compensation were $1.19 per share, ahead of the $1.15 expected.
- However, revenue came in at just $12.45 billion vs. the expected $12.47 billion.
- Oracle doesn’t expect things to improve in the second quarter.
- Despite other issues, cloud is a bright spot, with cloud revenue at $1.5 billion, up 66% from a year earlier.
- Faster than others but slower than the 76% reported last quarter.
- Please note, Oracle does their earnings WAY later than everyone else, that’s why they’re not included in our regular earnings show.
And that is the week in the cloud! We would like to thank our sponsor, Foghorn Consulting. Check out our website, the home of the Cloud Pod, where you can join our newsletter or Slack team, send feedback, or ask questions at theCloudPod.net – or tweet at us with hashtag #thecloudpod.