Welcome to The Cloud Pod episode 231! This week Justin and Matthew are discussing updates to Terraform testing for code validation, some new tools from Docker, a look at the now generally available Amazon DataZone, and the evolution of passkeys over at Google. Slide into the passenger seat and let’s check out this week’s cloud news.
Titles we almost went with this week:
- 🧑‍💻 The Cloud Pod wants to validate your code
- 📊The Cloud Pod can now test in parallel
A big thanks to this week’s sponsor:
Foghorn Consulting provides top-notch cloud and DevOps engineers to the world’s most innovative companies. Initiatives stalled because you have trouble hiring? Foghorn can be burning down your DevOps and Cloud backlogs as soon as next week.
📰General News this Week:📰
01:17 Terraform 1.6 adds a test framework for enhanced code validation
- At HashiConf this week, HashiCorp announced that Terraform 1.6 is now available for download.
- The most exciting feature? We’re so glad you asked!
- A new terraform test framework, which deprecates and replaces the experimental testing features added in 0.15.
- Terraform test allows authors to consistently validate the functionality of their configuration in a safe environment. Tests are written using familiar HCL syntax, so there is no need to learn a new language to get started.
- Config-driven import, introduced in Terraform 1.5, gets improvements to support variable-driven ID attributes, making it easier than ever to import existing resources.
- CLI improvements
- Several changes are coming to the S3 remote state backend in this release to better align with the SDK and the official Terraform AWS provider.
- It should still work, but you may receive warnings about deprecated attributes. May the odds be ever in your favor.
- You can check out the Testing Terraform overview page here, or the Write Terraform tests tutorial here.
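To make the new framework concrete, here is a minimal sketch of a test file that `terraform test` would pick up. The module, variable, and resource names (`bucket_prefix`, `aws_s3_bucket.example`) are illustrative assumptions, not from the announcement:

```hcl
# tests/naming.tftest.hcl -- runs with `terraform test` (Terraform 1.6+).
# Assumes a root module defining var.bucket_prefix and aws_s3_bucket.example.

variables {
  bucket_prefix = "demo"
}

run "bucket_name_uses_prefix" {
  # Run against a plan (not an apply) so the test touches no real infrastructure.
  command = plan

  assert {
    condition     = startswith(aws_s3_bucket.example.bucket, var.bucket_prefix)
    error_message = "Bucket name must start with var.bucket_prefix."
  }
}
```

Because it is plain HCL, assertions reuse the same functions and expressions you already write in configuration.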
03:22 📢 Justin – “One of the interesting things that, you know, that wasn’t part of this particular announcement is that they’re also adding an ability to use AI to help you with your test cases. And so basically the model, they built an LLM model to specifically trained on HCL and the Terraform test framework to help model authors begin testing their code.”
04:55 Docker debuts new tools for developing container applications
- Docker has released two new offerings: Docker Build and Docker Debug
- These tools will help software teams develop containers faster.
- Docker Build aims to simplify the so-called build process, or the task of turning raw code files into a container image. Building images can take up to an hour in some cases, but with Docker Build you can speed up the task by a factor of up to 39.
- It does this by performing many of the computations involved in the process on speedy cloud-based servers, which can process code faster.
- Docker Debug aims to ease the task of finding and fixing code issues in container applications.
- Often, applications written in different languages have to be troubleshot using different debugging tools. Developers likewise use separate tools for containers running in production versus on local machines.
- Debug provides all the debugging tools developers require in a single integrated package to reduce complexity.
03:22 📢 Matthew – “I think the only time I’ve had a container take more than 15 minutes to build is when I was compiling Ruby into a container, and source compiling it from scratch.”
AWS
08:09 Amazon DataZone Now Generally Available – Collaborate on Data Projects across Organizational Boundaries
- Amazon DataZone is now generally available.
- DataZone is a new data management service to catalog, discover, analyze, share, and govern data between producers and consumers in your organization.
- With Amazon DataZone, data producers populate the business data catalog with structured data assets from the AWS Glue Data Catalog and Amazon Redshift tables.
- Data consumers search and subscribe to data assets in the data catalog and share with other business use case collaborators.
- Consumers can analyze their subscribed data assets with tools such as Redshift or Athena.
09:51 📢 Justin – “The challenge is that as the data warehouse team is converted to data lakes, the matter of data has just blown up exponentially. And so the ability for them to do hand holding and things like that is really difficult. And so by being able to publish known data catalogs and then tell end users like, hey, yeah, just point your Excel at this or point your own Redshift cluster at it. You’re now democratizing and giving federated access to these things, but across control areas where you can really manage the governance of it, um, as well as data authentication and different things.”
10:43 Amazon EC2 C7a Instances Powered By 4th Gen AMD EPYC Processors for Compute Optimized Workloads
- New Amazon EC2 C7a instances are powered by 4th Gen AMD EPYC processors with a 3.7 GHz maximum frequency and deliver up to 50% higher performance compared to C6a instances.
- Sizes range from 1 vCPU with 2 GiB of memory up to 192 vCPUs with 384 GiB.
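Adopting the new family is typically a one-line instance-type change. A minimal sketch in Terraform, where the AMI variable is an assumption for illustration:

```hcl
# Illustrative only: launching a compute-optimized C7a instance.
resource "aws_instance" "compute" {
  ami           = var.ami_id    # assumption: an existing x86_64 AMI variable
  instance_type = "c7a.xlarge"  # 4 vCPUs / 8 GiB in the 4th Gen AMD EPYC family
}
```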
GCP
16:26 Passwordless by default: Make the switch to passkeys
- Google is rolling out support for passkeys and moving away from those pesky passwords, with the goal of making your account more secure than ever.
- To use a passkey, you just use a fingerprint, face scan, or PIN to unlock your device, and Google says passkeys are 40% faster than passwords.
17:18 📢 Matthew – “It’s a great next step; because let’s be honest – passwords are the bane of everyone’s existence.”
18:50 Google Cloud Public Sector UK: Helping government adapt to a digital future
- Google is announcing Google Cloud Public Sector UK, a new division dedicated to helping government departments and agencies across the UK transform their operations with hyperscale cloud capabilities.
19:17📢 Justin – “I think there’s a lot of interest from public sector companies or agencies all over the world who want access to more and more cloud resources and this makes your life easier.”
19:54 New Vertex AI Feature Store built with BigQuery, ready for predictive and generative AI
- The new Vertex AI Feature Store supports data engineering, data science, and ML workloads and is in public preview.
- Feature Store is fully powered by your organization’s existing BigQuery infrastructure and unlocks both predictive and generative AI workloads at any scale.
- Feature Store is a centralized repository for the management and processing of ML inputs, also known as features.
21:06 📢 Matthew – “Just the ‘hey we’ve done it once we don’t need to regenerate it multiple times’ and have it be exposed to multiple teams or departments or whoever it is; that right there is gonna be key and really help everyone.”
21:46 AlloyDB Omni, the downloadable edition of AlloyDB, is now generally available
- AlloyDB Omni, a downloadable edition of AlloyDB, is now GA. It offers a compelling choice for workloads, providing the flexibility to run the same enterprise-class database across on-premises environments, other clouds, or even developer laptops.
- AlloyDB Omni even includes support for AlloyDB AI, an integrated set of capabilities built into AlloyDB for PostgreSQL to help developers build enterprise-grade gen AI apps using their operational data.
- They are also launching in preview the AlloyDB Omni Kubernetes operator, which simplifies common database tasks including database provisioning, backups, secure connectivity, and observability.
- AlloyDB Omni is available with a monthly subscription: a 16-vCPU starter pack at $1,295 per month, with additional capacity in 100-vCPU blocks at $7,000 a month and discounts for 1-3 year commitments.
22:42 📢 Justin – “I was kind of disappointed they didn’t just make this open source because, you know, Postgres is already open source and if you could, if AlloyDB had a significant advantage over Postgres, they could basically start dominating all kinds of workloads that are living on Postgres today and then just migrate them into GCP when you wanted to no longer manage those things. Very similar to what Azure does with SQL.”
24:26 Google makes its Cloud Spanner database service faster and more cost-efficient
- Google LLC introduced a new version of Cloud Spanner, one of its managed database services, that will enable customers to process their information faster and more cost-efficiently.
- The new Cloud Spanner focuses on improving the database’s performance; according to Google, the database provides 50% higher throughput at the same price.
- This gives Cloud Spanner an edge over Amazon DynamoDB.
- You can also now store up to 10 TB of data in Cloud Spanner, up from 4 TB – and cause yourself a lot more pain when it all goes wrong.
Azure
- You can now use CMK (customer-managed keys) at the database level with Azure SQL.
- Previously you could only apply encryption at the server level, but with this new capability you can set encryption keys at both the database and server level.
28:07 📢 Matthew- “Yeah, so when you’re running a multi-tenant solution, having per database keys is definitely a little bit more preferred. So before you had to have it at the whole SQL level, so if you wanted to have different keys you would have to launch different SQL servers… so this for me at my day job is going to be extremely beneficial.”
28:33 📢 Justin – “Surprised this didn’t exist already, because I think, I’m pretty sure an on-prem SQL server that is Azure SQL, you can deploy TDE on a per database thing…”
29:15 Generally Available: Azure Dedicated Host – Resize
- Azure Dedicated Hosts are a type of Azure service that provide dedicated physical servers for your workloads.
- Azure Dedicated Hosts can now be resized, allowing you to change the size of your host to meet your changing needs.
40:13 Announcing Microsoft Playwright Testing: Scalable end-to-end testing for modern web apps
- Other news from the terrible names department: Microsoft Playwright Testing is a new service that enables you to run Playwright tests easily at scale. Playwright is a fast-growing, open-source framework that enables reliable end-to-end testing and automation for modern web apps. The Microsoft Playwright Testing service uses the cloud to enable you to run Playwright tests with much higher parallelization across different operating system-browser combinations simultaneously. This means faster testing and quicker troubleshooting, which helps speed up delivery of features without sacrificing quality.
- Here are some of the key benefits of using Microsoft Playwright Testing:
- Scalability: Azure Playwright Testing can scale to meet the needs of even the largest and most demanding web applications. You can run thousands of tests in parallel across multiple operating system-browser combinations, which can significantly reduce the time it takes to complete your test runs.
- Reliability: Azure Playwright Testing is built on top of the Microsoft Azure platform, which is known for its reliability and performance. You can be confident that your tests will run smoothly and consistently, even when you are running a large number of tests in parallel.
- Ease of use: Azure Playwright Testing is designed to be easy to use. You can get started quickly without having to make any changes to your existing Playwright test suite.
31:15 📢 Matthew – “I mean, if you’re already using the tool, adding on the ability to do unit tests and launching – from my understanding it will actually launch web browsers and do multiple testing on OS and web browser and kind of mix and match and make sure it all works. The parallelization of it is definitely going to be key. And if it works, great.”
Closing
And that is the week in the cloud! We would like to thank our sponsor, Foghorn Consulting. Check out our website, the home of The Cloud Pod, where you can join our newsletter or Slack team, send feedback, or ask questions at theCloudPod.net, or tweet at us with the hashtag #theCloudPod.