AWS re:Invent 2019: A Recap
Numerator provides education and training resources to our R&D engineering staff so they can continuously evolve and adapt to our business needs in an ever-changing software development landscape. Often this translates into tech conferences, and the DevOps teams' conference of choice is AWS re:Invent. re:Invent is an amazing conference to attend, even though you can watch the majority of the session talks after the fact, online, through legitimate sources.
What you can't experience by watching online is the week-long water-cooler experience that is AWS re:Invent. You will run into peers from the industry, tossing around ideas and catching up. You'll hear interesting things at the lunch tables. You'll meet old co-workers whose LinkedIn status has changed since you last spoke, now that they've moved on to <Insert Big Name Company Here>. Attending re:Invent in person gives you real-time, crowdsourced feedback on the concepts and products on stage. Experiencing the event and reading the room's reactions is powerful input for assessing real-world adoption of the tool or idea being presented.
This is also where the experts are. Vendors bring their best sales and professional services staff to re:Invent's vendor floor. With a simple scan of your attendee badge, you get a "best foot forward, fast" demo of each vendor's products and upcoming releases. Though the floor can be a bit hectic, where else can you see the latest demos and sneak peeks, chat with sales engineering experts, and walk away with fun swag?
AWS makes sure its own experts are easily accessible in a variety of venues: chalk talks, AWS Hero events, Jam sessions, the Builders Fair, or on stage and available for Q&A after a session. There is always an opportunity to meet and chat with the greatest minds in your field at re:Invent.
I had an awesome workshop experience this year. AWS re:Invent has gotten its workshops right. Attendees are given a slip of paper that logs them into their very own private, locked-down AWS account that self-destructs after 24 hours, pre-baked with the workshop requirements: pre-configured IAM user accounts, a VPC, EC2 instances, and so on.
What doesn't work? AWS is still working on managing the size of the conference, which is spread out among four different hotels; I missed two workshops by just minutes due to traffic and the struggle of getting from one hotel to another. I wasn't too bummed when I missed a session or workshop, though, because AWS does well to simulcast its popular, large-room session talks into "overflow" rooms hosted in each session hotel.
The overflow rooms were how I was able to catch a few good talks I otherwise wouldn't have seen when I did miss a workshop. I enjoyed catching Adrian Cockcroft's ARC203 - Innovation at speed (AWS re:Invent 2019: Innovation at speed (ARC203)), where he covers how to model and address common blockers to business agility and innovation. Though not a technical talk, it is still a must-watch for DevOps practitioners.
For DevOps engineers and software developers getting started with serverless and Lambda, I personally recommend Heitor Lessa's talk "ARC307-R Serverless Architectural Patterns & Best Practices". This is a 300-level talk, so it doesn't get bogged down explaining obvious concepts, and it is packed with links and references for those inclined to dig deeper. With each concept, he gives a quick proof of concept aligned with the AWS Well-Architected Framework. https://twitter.com/heitor_lessa/status/1203360908485054465?s=20 I know a few software developers on staff who would benefit greatly from his talk :)
From a security perspective, I was most impressed with AWS's announcements of Provable Security in AWS (https://aws.amazon.com/security/provable-security/) with IAM access analysis (https://aws.amazon.com/iam/features/analyze-access/). The boto3 libraries were updated the day this was announced, and you can use IaC techniques to manage and consume access analyzers (see: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/accessanalyzer.html). The related session, "Provable access control: SEC343-R", starts with the business requirements and challenges faced by AWS's security team, then digs into how they built a mathematically provable security tool using automated reasoning and by hiring wicked smart people. You can find that talk here: AWS re:Invent 2019: Provable access control: Know who can access your AWS resources (SEC343-R). The research paper behind the technology is explained in "Reachability Analysis for AWS-based Networks", found here: https://d1.awsstatic.com/whitepapers/Security/Reachability_Analysis_for_AWS-based_Networks.pdf
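As a taste of the IaC angle, here's a minimal CloudFormation sketch that stands up an account-level analyzer using the AWS::AccessAnalyzer::Analyzer resource type; the analyzer name is a placeholder I made up for illustration:

```yaml
# Minimal sketch: creates an IAM Access Analyzer scoped to this account.
# "account-analyzer" is a hypothetical name; pick your own convention.
Resources:
  AccountAccessAnalyzer:
    Type: AWS::AccessAnalyzer::Analyzer
    Properties:
      AnalyzerName: account-analyzer
      Type: ACCOUNT
```

Once the analyzer exists, findings can be pulled programmatically via the boto3 `accessanalyzer` client linked above and fed into whatever alerting or ticketing pipeline you already run.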
I noted a common theme across many of the keynotes and session talks I attended: adherence, and explicit call-outs, to the pillars of the AWS Well-Architected Framework (https://aws.amazon.com/blogs/apn/the-5-pillars-of-the-aws-well-architected-framework/). Some of the session talks walked the audience through the evolution of a product, and the tradeoffs to consider, as they focused on each pillar in turn and how it impacted the product's architecture. For those who develop cloud-based applications and architectures, it has become vitally important to know how to model your application against the Well-Architected pillars.
I was greatly impressed when AWS announced S3 Access Points (https://aws.amazon.com/about-aws/whats-new/2019/12/amazon-s3-access-points-manage-data-access-at-scale-shared-data-sets/). As customers use S3 for more data-lake needs, it has become difficult to manage the one-to-many relationship between a bucket and the differing access rights of each application or role. With S3 Access Points, we can now automate the creation of an access point per application, and policy and access rights can be managed against each unique endpoint. For example, we can lock down S3 buckets against public access while permissively opening them to "Production" applications in specific VPCs in our accounts.
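To make that concrete, here's a small Python sketch of the per-application automation idea. It builds an access point policy that grants a single production role read access through one access point; the account ID, access point name, bucket, VPC ID, and role are all hypothetical placeholders, and the boto3 `s3control` calls that would apply it are shown as comments since they require real credentials:

```python
import json

# Hypothetical identifiers for illustration only.
ACCOUNT_ID = "111122223333"
ACCESS_POINT_NAME = "prod-analytics-ap"
PROD_ROLE_ARN = f"arn:aws:iam::{ACCOUNT_ID}:role/ProdAnalyticsRole"


def access_point_policy(account_id: str, ap_name: str, role_arn: str) -> dict:
    """Build an access point policy allowing one production role to read
    objects through this access point only (not the bucket directly)."""
    ap_arn = f"arn:aws:s3:us-east-1:{account_id}:accesspoint/{ap_name}"
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {"AWS": role_arn},
                "Action": ["s3:GetObject"],
                "Resource": f"{ap_arn}/object/*",
            }
        ],
    }


policy = access_point_policy(ACCOUNT_ID, ACCESS_POINT_NAME, PROD_ROLE_ARN)
print(json.dumps(policy, indent=2))

# With boto3 installed and credentials configured, applying it looks like
# (sketch only, not run here):
#   s3control = boto3.client("s3control")
#   s3control.create_access_point(
#       AccountId=ACCOUNT_ID,
#       Name=ACCESS_POINT_NAME,
#       Bucket="my-data-lake-bucket",                 # hypothetical bucket
#       VpcConfiguration={"VpcId": "vpc-0abc1234"},   # restrict to this VPC
#   )
#   s3control.put_access_point_policy(
#       AccountId=ACCOUNT_ID,
#       Name=ACCESS_POINT_NAME,
#       Policy=json.dumps(policy),
#   )
```

Looping this over a list of applications gives each one its own endpoint and policy, instead of piling every rule into a single bucket policy.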
I was also impressed with the announcement of the Amazon Builders' Library. AWS has done great work providing additional resources in the spirit of community-taught education, letting us freely leverage their expertise on how to build secure, reliable, performant, and cost-effective applications in the AWS cloud.
That's a wrap, and to be honest, I'm walking away with more homework than expected. You bet I'm going to rewatch the talks mentioned above, plus a few more on my list. Happy 2020, everybody!