Every day at Logicworks, engineers work to keep our hundreds of customer infrastructure systems up and running. But equally important is the work of Dakota Blair, Senior Software Engineer, who works behind the scenes to develop software that creates cost, security, and incident reports that help our customers monitor and manage their systems.
We sat down with Dakota this week to get a glimpse of what it’s like to be a software engineer at Logicworks. If you’re interested in joining our team or learning more about Logicworks, visit our careers page or contact us.
What software do you develop for the cloud?
I develop software that creates custom reports for our customers, which our customers and engineers use to make cloud management decisions.
For example, I can create a custom billing report that outlines what a customer is paying for each service on AWS, or what each team or engineer is spending, and then compare that with historical data to make future projections. Our engineers then use that data to develop a comprehensive cost optimization plan, such as scheduling instance shutdowns during off-hours or purchasing lower-cost instance types.
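A projection like that can be sketched with a simple linear trend over historical monthly spend. This is a hypothetical illustration only, not Logicworks' actual reporting code; in practice the monthly totals would come from AWS billing data rather than a hard-coded list.

```python
from statistics import mean

def project_next_month(monthly_costs):
    """Project next month's spend from a simple linear trend.

    monthly_costs: historical monthly totals in USD, oldest first.
    """
    n = len(monthly_costs)
    if n < 2:
        raise ValueError("need at least two months of history")
    xs = range(n)
    x_bar, y_bar = mean(xs), mean(monthly_costs)
    # Ordinary least-squares slope of cost vs. month index
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, monthly_costs)) \
        / sum((x - x_bar) ** 2 for x in xs)
    # Evaluate the fitted line at the next month index (n)
    return y_bar + slope * (n - x_bar)

# e.g. steadily rising EC2 spend over four months (illustrative numbers)
history = [1200.0, 1350.0, 1500.0, 1650.0]
print(round(project_next_month(history), 2))  # 1800.0
```

Real reports would also break the totals down by service, team, or tag before projecting, but the trend-fitting step looks the same.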
How did you get into software engineering?
I’m a mathematician by training and got my PhD in Mathematics from the CUNY Graduate Center in 2015. I’ve always walked the line between software engineer and mathematician; while studying for my PhD, I worked as a software engineer at New York Magazine and Freelancers Union.
Math helps me think about problems correctly. Software lets me solve problems that I couldn’t work through by hand. I can use math to diagram how processes should fit together, and I can apply the same mindset to computing problems such as analytics, using map-reduce, machine learning, and other tools.
Why is this type of data reporting important for cloud users?
AWS provides great built-in monitoring tools and dashboards, but unless you’re constantly looking at the AWS console, you could miss important events or trends. What we do is pull together the right information and subject it to human intelligence — so that our customers can react to important information and filter out inconsequential data.
For most of our customers, the most important reports I produce are around cloud costs. We use data from a variety of sources, including AWS itself and third-party tools, to create these reports. AWS customers not only want to cut costs; they also want to make sure that they’re getting the most out of each dollar. Efficient cloud spending can add value to a client’s bottom line.
In the end, there is nothing worse than receiving your AWS bill and manually tracing expenses back to their associated projects. In addition to being inefficient and time-consuming, it blocks the feedback loops between business and IT about where to invest budget or refocus priorities.
What options exist for AWS cost optimization?
Here are a few of the cost optimization tasks that Logicworks generally helps AWS customers with.
- Identifying and eliminating wasted resources
- Purchasing Reserved Instances or planning for Spot Instances
- Scheduling shutdowns (especially for dev / test environments)
- Automating infrastructure build-out (this helps companies launch consistent environments)
- Setting up billing alerts
- Tagging resources by team / project / app for long-term tracking
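The scheduled-shutdown and tagging items above can be combined into a small policy check: given a tagged fleet and the current time, decide which instances to stop. This is a hedged sketch, not Logicworks’ actual tooling; the `Schedule` tag name, the off-hours window, and the instance records are all assumptions made for illustration.

```python
from datetime import time

# Hypothetical policy values, chosen for this example only
OFF_HOURS_START = time(20, 0)   # 8 PM
OFF_HOURS_END = time(7, 0)      # 7 AM

def is_off_hours(now):
    """True during the overnight window; a window that wraps midnight needs OR."""
    return now >= OFF_HOURS_START or now < OFF_HOURS_END

def instances_to_stop(instances, now):
    """Select running instances tagged for off-hours shutdown.

    instances: list of dicts like
      {"id": "i-0abc", "state": "running", "tags": {"Schedule": "office-hours"}}
    A production job would fetch this inventory from the EC2 API (e.g. via
    boto3 describe_instances) and pass the result to stop_instances.
    """
    if not is_off_hours(now):
        return []
    return [
        inst["id"]
        for inst in instances
        if inst["state"] == "running"
        and inst.get("tags", {}).get("Schedule") == "office-hours"
    ]

fleet = [
    {"id": "i-0dev", "state": "running", "tags": {"Schedule": "office-hours"}},
    {"id": "i-0prod", "state": "running", "tags": {"Schedule": "always-on"}},
    {"id": "i-0idle", "state": "stopped", "tags": {"Schedule": "office-hours"}},
]
print(instances_to_stop(fleet, time(22, 30)))  # ['i-0dev']
```

Driving the policy from tags is what makes it scale: the same job covers every team’s dev/test instances as long as they are tagged consistently.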
But of course, cloud cost optimization does not end with this series of technical tasks. Tracking AWS costs is complicated, especially when you have several major business units, distributed teams, and dozens or even hundreds of projects. In larger companies, it also requires clear communication channels between business and IT.
If you have any questions about AWS cost optimization, please reach out to a cloud expert at Logicworks.