Table of Contents
Introduction: Erik Reinert opens the course by explaining why the ability to identify and break down common problems within an organization, and to build solutions using standardized practices, is an essential characteristic of DevOps engineers. The course Introducing DevOps for Developers introduced these concepts; this course applies them practically in enterprise scenarios.
Three Ws: Erik reviews the Three Ws of strategy: What problems are we trying to solve? Who are we solving them for? Why are we solving these problems?
Organization Problems: Erik discusses organizational problems solved by DevOps, including role-based access control, domains/DNS, networking, and resource allocation (compute, storage, etc.).
Source Control & Service Management: Erik walks through developer problems surrounding source control and service management. The best solutions support developers and let them focus solely on the source code and the mechanisms around deploying it. A question about handling hot-fixes is also discussed in this lesson.
Continuous Delivery & Continuous Integration: Erik discusses the importance of developer empowerment in DevOps practices and the responsibility of different team members when something goes wrong, emphasizing the need for developers to take ownership of their services, configurations, and deployments. A real-world example of how empowering developers led to more frequent and efficient deployments is also included in this lesson.
Logging & Metrics: Erik compares the logging responsibilities of DevOps engineers with those of software developers. Developers need access to logs and metrics for monitoring and troubleshooting. DevOps provides access to the logs, but it's the developer's responsibility to check them.
Introduction to GitOps: Erik introduces GitOps, explaining that it is a paradigm or practice that uses Git as the single source of truth for declarative infrastructure and applications. The benefits of GitOps, including infrastructure as code, automated deployments, reproducibility, rollbacks, and enhanced collaboration, are also discussed in this lesson.
Terraform & Pulumi: Erik compares the automation tools Terraform and Pulumi. Terraform is an industry-standard tool for declarative infrastructure automation. Pulumi takes an imperative approach, using general-purpose programming languages for infrastructure configuration. DevOps engineers should choose the tool that aligns with their specific needs and preferences rather than following trends.
Automation Tools Q&A: Erik answers questions about testing frameworks, workflow differences, syntax variations, migration challenges, and the sustainability of using multiple languages with Pulumi. Maintaining a single source of truth is critical for effective infrastructure management.
Automation Providers: Erik discusses the importance of automation providers for streamlining infrastructure management, improving collaboration, and lowering the cost of managing infrastructure. Automation providers require financial investment, but that investment frees up time and resources that would otherwise be spent building the same infrastructure management in-house, allowing teams to address more critical business challenges. Erik also provides insights on choosing the right automation provider by focusing on solving specific problems, leveraging existing tools, and prioritizing the features essential to your organization's needs.
Automation Providers Q&A: Erik answers questions about automation providers, including the role of tools like Ansible in infrastructure management and managing permissions and keys.
Continuous Integration & Delivery CI/CD: Erik compares Continuous Integration (CI) and Continuous Delivery (CD). CI involves automating code integration from multiple contributors into a single project and validating changes. CD focuses on delivering software in short cycles, getting it into production as quickly as possible. While CI and CD are often used together, they are not the same thing, and mixing the two should be done carefully.
CI/CD Q&A: Erik answers questions about CI/CD processes and advises against relying on manual scripts. The benefits of using cloud or CI/CD provider tools to run containers are also discussed in this lesson.
Self-Hosted vs Cloud Providers: Erik addresses common misconceptions about self-hosting being easier or cheaper than cloud hosting, highlighting the complexities and maintenance involved in self-hosted solutions. The decision between self-hosting and using cloud providers should be based on an organization's specific needs, resources, and confidence in managing infrastructure.
Infrastructure Setup: Erik discusses the service providers used throughout the course, including Doppler for secret management, GitHub Actions for CI/CD, Terraform for automation, and AWS as the chosen cloud provider. A question about secrets management is also discussed in this lesson.
Terraform Cloud Components: Erik introduces the infrastructure automation process with Terraform components. A modular approach to managing various infrastructure components with a GitOps workflow adds flexibility and allows for easy scalability and updates across multiple environments.
Developer Service Components: Erik outlines the logical separation of developer service components. This structure gives developers access to their code repositories and deployments while allowing DevOps to manage the workspaces. The workshop materials repositories are also provided and contain the final solutions and branches for each step in the process.
Creating a Doppler Project: Erik introduces Doppler, a tool that simplifies the management of environment variables and secrets. Doppler lets users organize secrets by project, workspace, and configuration, providing a convenient way to store and access secrets securely. A Doppler project is created, and the Doppler CLI is demonstrated.
GitHub Configuration: Erik walks through creating an SSH key for a GitHub account. A fine-grained personal access token is also added to GitHub and configured so Terraform can manage repositories. Secrets for GITHUB_TOKEN and GITHUB_OWNER are then added to the Doppler project.
Terraform Cloud Setup: Erik sets up a Terraform Cloud account and generates an API token for accessing the Terraform Cloud API. The API token is added to Doppler.
AWS Setup: Erik creates an IAM user and administrator group in Amazon Web Services (AWS) to ensure secure and organized access to AWS resources. Using an account other than the root account avoids security risks. Account aliases are also explained in this lesson.
AWS Security Credentials & Doppler CLI: Erik creates an AWS access key, which grants access to the AWS CLI. The access key is stored in Doppler with the other secrets. The Doppler CLI is set up and tested.
Creating a Managed Repository: Erik creates a private GitHub repository for managing Terraform automations. The repository can be cloned to your computer using Git or the GitHub CLI. Instructions for installing and setting up the GitHub CLI are also included in this lesson.
Adding Providers: Erik explains how providers enable DevOps engineers to work with multiple clouds and connect to other services within the same automation. The Terraform Registry is similar to NPM and contains providers for many popular services. Project and Workspace modules are added to the main.tf file.
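A minimal sketch of the kind of provider declarations this lesson adds; the version constraints are illustrative, not the ones used in the course:

```hcl
terraform {
  required_providers {
    # Terraform Cloud/Enterprise provider for managing projects and workspaces
    tfe = {
      source  = "hashicorp/tfe"
      version = "~> 0.50" # illustrative constraint
    }
    # GitHub provider for managing repositories
    github = {
      source  = "integrations/github"
      version = "~> 6.0" # illustrative constraint
    }
  }
}
```

Running `terraform init` after this downloads both providers from the registry, much like installing dependencies with NPM.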
Using the Terraform CLI: Erik uses the Terraform CLI to initialize the repo. The validate, plan, and apply commands are also introduced.
Migrating to Terraform Cloud: Erik migrates the state to Terraform Cloud and removes the local state from the repository. Doppler is used because the authentication keys are required to connect to Terraform Cloud.
Locals: Erik explains how Terraform's dynamic configurations streamline the creation of multiple workspaces and projects. Rather than copying and pasting configurations, locals are used to create variables for projects and workspaces.
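A hedged sketch of what such a locals block might look like; the project and workspace names are made up for illustration:

```hcl
locals {
  # One entry per Terraform Cloud project; map keys act as stable identifiers
  projects = {
    devops = { description = "DevOps-managed automation" }
  }

  # One entry per workspace, recording which project it belongs to
  workspaces = {
    tfe    = { project = "devops" }
    github = { project = "devops" }
  }
}
```

Adding a new workspace then becomes a one-line change to the map rather than a copied-and-pasted resource block.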
Iterating Through Locals: Erik uses a for_each loop to iterate over the local variables and generate the dynamic configurations. Values are accessed through the "each" property. Accessing project-specific variables is also demonstrated in this lesson.
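A sketch of iterating over such locals with for_each; `var.organization` and the resource labels are assumptions for illustration:

```hcl
# Create one Terraform Cloud project per entry in local.projects
resource "tfe_project" "this" {
  for_each     = local.projects
  organization = var.organization # assumed variable
  name         = each.key         # e.g. "devops"
}

# Create one workspace per entry, looking up its project by key
resource "tfe_workspace" "this" {
  for_each     = local.workspaces
  organization = var.organization
  name         = each.key
  project_id   = tfe_project.this[each.value.project].id
}
```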
Moving Resources: Erik uses a moved meta-argument to prevent resource deletion during updates. Once the change has been planned and applied, the moved meta-argument can be deleted.
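A minimal sketch of a moved block; the addresses are illustrative and would need to match the actual resources in state:

```hcl
# Tell Terraform the resource was renamed, not destroyed and recreated.
moved {
  from = tfe_workspace.main        # old address (illustrative)
  to   = tfe_workspace.this["tfe"] # new for_each-keyed address (illustrative)
}
```

After the plan and apply complete, the block has done its job and can be removed from the configuration.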
Renaming the Workspace: Erik renames the workspace to clarify its purpose and uses the moved meta-argument to update Terraform. The separation and visualization of state are also discussed in this lesson.
Updating the TFE Repo: Erik commits the local changes in the fem-eci-terraform-tfe repository and pushes the changes up to GitHub.com.
Configuring a Version Control Provider: Erik connects Terraform Cloud to a version control system (VCS) like GitHub. A variable representing the GitHub installation ID is added to a data file so it can be added to the workspace.
Adding a VCS Repo Identifier: Erik validates the changes using Terraform. The repo is then planned and applied. Since Terraform Cloud will be automating the workflow, the execution mode is changed to "remote".
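A sketch of what the workspace might look like once the VCS repo and execution mode are wired up, assuming a GitHub App connection; the owner/repo identifier and variable name are placeholders:

```hcl
resource "tfe_workspace" "this" {
  # ... name, organization, project_id as before ...
  execution_mode = "remote" # Terraform Cloud runs plans and applies

  vcs_repo {
    identifier                 = "your-org/fem-eci-terraform-tfe" # owner/repo (placeholder)
    github_app_installation_id = var.github_installation_id       # from the VCS connection
  }
}
```

With this in place, a push to the connected repository triggers a plan in Terraform Cloud automatically.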
Creating a Variable Set: Erik finishes configuring the GitOps workflow. A variable set is created to store the TFE_TOKEN so Terraform has access to run the automation. The Terraform UI is used to re-run the last operation.
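A hedged sketch of a variable set holding the TFE_TOKEN; the set name and the way the token value is supplied are assumptions:

```hcl
# A shared variable set so multiple workspaces can authenticate to Terraform Cloud
resource "tfe_variable_set" "tokens" {
  name         = "tfe-credentials" # illustrative name
  organization = var.organization
}

resource "tfe_variable" "tfe_token" {
  key             = "TFE_TOKEN"
  value           = var.tfe_token # supplied securely, e.g. via Doppler
  category        = "env"         # exposed as an environment variable
  sensitive       = true          # hidden in the Terraform Cloud UI
  variable_set_id = tfe_variable_set.tokens.id
}
```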
Creating a GitHub Automation Repo: Erik discusses the separation of automation workspaces for Terraform Cloud and GitHub in a GitOps workflow. A new GitHub repository is created to automate the creation of additional code repositories. Questions about using branches instead of repositories are also discussed in this lesson.
Configuring a GitHub Workspace: Erik configures a Terraform automation for the GitHub workspace. Once the workspace is configured, it is committed and pushed to the TFE repo, and Terraform plans the automation. Once the plan is completed, the changes can be applied in Terraform Cloud.
Using the Repository Terraform Module: Erik uses the repository module to set up the GitHub workspace repository. To reduce redundancy, a repos property is added to the locals file, and the repository module uses a for_each loop to create and manage each repository.
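A sketch of the pattern described here; the module source path, repo name, and attribute names are placeholders, not the course's exact values:

```hcl
locals {
  # One entry per repository DevOps should manage
  repos = {
    "example-service" = { # placeholder repo name
      description = "Example developer service"
      visibility  = "private" # public if no GitHub Pro account
    }
  }
}

module "repository" {
  source   = "./modules/repository" # illustrative module path
  for_each = local.repos

  name        = each.key
  description = each.value.description
  visibility  = each.value.visibility
}
```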
Adding Repository to Terraform Cloud: Erik creates a backend.tf file so the new repo can be added to Terraform Cloud. The terraform plan and apply commands are run for the configuration. Note: The repos will need to have public visibility if a GitHub Pro account is not being used.
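A minimal sketch of such a backend.tf using Terraform's cloud block; the organization and workspace names are placeholders:

```hcl
# backend.tf: attach this configuration's state to Terraform Cloud
terraform {
  cloud {
    organization = "your-org" # placeholder organization name

    workspaces {
      name = "github" # the workspace managing this repo (placeholder)
    }
  }
}
```

After adding this, `terraform init` connects the configuration to the named workspace so plans and applies run against remote state.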
Committing and Pushing Updates: Erik completes the full automation and synchronization in Terraform. Environment variables are added in Terraform Cloud for the GitHub token and GitHub owner. This robust GitOps approach enhances the ability to manage and automate infrastructure effectively.
GitHub Automation Q&A: Erik responds to questions about deleting/archiving repositories and creating global variable sets for tokens.
Creating AWS Automation Repos: Erik uses the GitHub automation repo to create AWS automation repos for networks and clusters. With branch protection enabled, the changes are added to a branch, and a PR is created. Merging the PR will apply the changes in Terraform Cloud.
Configuring an AWS Network: Erik configures an AWS network automation. The configuration code is copied from the example repo, and the options are discussed, including subnet address space calculations and the organization of variables.
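Subnet address space calculations like those discussed here are typically done with Terraform's built-in cidrsubnet function; the CIDR range and availability zones below are illustrative:

```hcl
variable "vpc_cidr" {
  type    = string
  default = "10.0.0.0/16" # illustrative VPC address space
}

locals {
  azs = ["us-east-1a", "us-east-1b", "us-east-1c"] # illustrative AZs

  # cidrsubnet(prefix, newbits, netnum): adding 4 bits to a /16 yields /20
  # subnets, one per availability zone:
  # 10.0.0.0/20, 10.0.16.0/20, 10.0.32.0/20
  subnets = { for i, az in local.azs : az => cidrsubnet(var.vpc_cidr, 4, i) }
}
```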
Configuring an AWS Cluster: Erik configures the AWS cluster automation and discusses features including domain, environment, instance type, market type, and VPC name. Templates offer structured naming conventions and add ease to automating what would be a complex process using the AWS UI.
Adding AWS Repos to Workspace: Erik adds the AWS repos to the Terraform Cloud workspaces. The changes cannot be applied yet because the AWS credentials are not added to the workspace.
Creating AWS Variable Sets: Erik creates a variable set for the AWS network and cluster workspaces. This variable set provides the AWS credentials to Terraform Cloud so that any required networks and clusters can be created in AWS without the user needing to use the AWS UI.
Generating the AWS Services: Erik generates the AWS services and uses the AWS UI to highlight what was created by Terraform Cloud. SSH key pairs are still required to finish the automation; they can be generated from the EC2 interface.
Service Infrastructure Automation
Creating the Service Repo: Erik focuses on service infrastructure automation. One workspace will be configured to work with multiple cloud providers to simplify the architecture. The service repository is created and represents a service created by the development team.
Creating the Product Service Repo: Erik creates the automation repository for the service infrastructure. This repository deploys the service repos created by developers.
Deploying a Service: Erik walks through deploying the service repository to AWS. When code and Docker images are committed in the developer repositories, a GitHub Action connects to AWS and pushes the image to ECS.