AWS CodeCommit is Amazon Web Services' Git hosting service and part of their suite of developer tools. It's comparable to something like GitHub or GitLab, although, to be completely honest, both of those services are ahead of CodeCommit in terms of usability and feature set. It does, however, have the advantage of being closely tied to other AWS services, in particular IAM for permission management.
CodeCommit allows for the usual pull request development workflow. Annoyingly, though, you can only view outstanding pull requests against a single repository at a time. There is no way in the AWS Console to get an overview of all outstanding pull requests across all repositories in your account (within the current region).
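The data is all there through the API, though. As a quick sketch of the idea (pagination via nextToken is omitted for brevity, and the function name is my own rather than anything from this post), boto3 can walk every repository in the region and list its open pull requests:

```python
import boto3

codecommit = boto3.client("codecommit")

def open_pull_requests():
    """Yield (repository_name, pull_request_id) for every open pull request
    in the current region. Pagination (nextToken) omitted for brevity."""
    repos = codecommit.list_repositories()["repositories"]
    for repo in repos:
        name = repo["repositoryName"]
        pr_ids = codecommit.list_pull_requests(
            repositoryName=name, pullRequestStatus="OPEN"
        )["pullRequestIds"]
        for pr_id in pr_ids:
            yield name, pr_id

for repo_name, pr_id in open_pull_requests():
    print(repo_name, pr_id)
```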
As someone who is responsible for performing code reviews, I found this quite frustrating in that I couldn't at a

A neat feature of Amazon Web Services' API Gateway is that it can integrate directly with other AWS services. The most common use of API Gateway is to integrate with a Lambda function, typically to perform an action like updating a DynamoDB table or sending a message to an SQS queue. But sometimes a Lambda function isn't actually necessary at all: by taking advantage of API Gateway's AWS service integrations, we can avoid this intermediary step altogether and build a much more efficient and resilient architecture.
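To make the idea concrete, here's a rough boto3 sketch of what a direct API Gateway to SQS integration can look like. Every ID, ARN, region, and name below is a placeholder rather than anything from this project:

```python
import boto3

apigateway = boto3.client("apigateway")

apigateway.put_integration(
    restApiId="abc123",          # placeholder REST API id
    resourceId="def456",         # placeholder resource id
    httpMethod="POST",
    type="AWS",                  # direct AWS service integration, no Lambda
    integrationHttpMethod="POST",
    # Placeholder region, account id, and queue name.
    uri="arn:aws:apigateway:ap-southeast-2:sqs:path/123456789012/my-queue",
    # Role that allows API Gateway to call sqs:SendMessage (placeholder ARN).
    credentials="arn:aws:iam::123456789012:role/apigw-sqs-role",
    requestParameters={
        "integration.request.header.Content-Type":
            "'application/x-www-form-urlencoded'"
    },
    # Rewrite the incoming JSON body into an SQS SendMessage call.
    requestTemplates={
        "application/json": "Action=SendMessage&MessageBody=$input.body"
    },
)
```

On its own this only wires up the integration request; the method, integration response, and deployment still need to be configured. But it shows the key part: the mapping template turns the incoming JSON body into an SQS SendMessage call, with no Lambda function in between.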
I recently had a need to provision a simple API endpoint that would accept a JSON payload and store the data to be processed later. Typically I would look to use API Gateway backed by a basic Lambda function to accept the JSON data and store it in an SQS queue. Instead, for this project I opted to use the API Gateway AWS service integration with the SQS

I recently wrote an API endpoint for a project at VaultRealEstate which required me to generate a month-by-month commission breakdown for a sales agent. The endpoint accepts an arbitrary start and end month and should return a JSON object showing all the distinct months in that date range and the commission performance for each of those months (for example, for displaying a bar chart).
I found that this is actually not a trivial problem, and the built-in datetime module can't really solve it nicely on its own. There are third-party libraries like dateutil which can, and normally I would reach straight for one of those. However, this project is hosted on AWS Lambda and I'm conscious of the deployment package growing with every dependency introduced, so I only like to introduce a dependency when it's genuinely necessary.
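To give a flavour of what solving it with the standard library alone looks like, here's a minimal sketch; the function name and signature are mine, not necessarily what the project uses:

```python
from datetime import date

def month_range(start: date, end: date):
    """Yield the first day of each month from start's month to end's month, inclusive."""
    current = date(start.year, start.month, 1)
    last = date(end.year, end.month, 1)
    while current <= last:
        yield current
        # Roll over to January of the next year after December.
        if current.month == 12:
            current = date(current.year + 1, 1, 1)
        else:
            current = date(current.year, current.month + 1, 1)
```

For example, month_range(date(2019, 11, 5), date(2020, 2, 10)) yields the first of November, December, January, and February, which can then be used as the keys of the JSON summary.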
I found

Something awesome about using AWS is the ability to auto-scale your workloads. At the risk of over-simplifying it: you set up a Launch Configuration and an Auto Scaling Group to which the Launch Configuration applies. The Launch Configuration defines the EC2 instance type, security groups, base AMI image, and other attributes for the instances in your Auto Scaling Group. Once this is all set up, new machines can be spun up automatically based on the load experienced by your application.
Something not so awesome about using AWS is when you need to update the base AMI image for your launch configuration. For example, you may have added a new required system package or code library to your application, and you need to ensure that all your instances, now and in the future, have it installed. You create a new AMI image, but now what?
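As a rough preview of where this ends up, here's a boto3 sketch of swapping a new AMI into an Auto Scaling Group by creating a fresh launch configuration based on the old one; all names and IDs are placeholders, not this project's code:

```python
import boto3

autoscaling = boto3.client("autoscaling")

def swap_ami(asg_name, old_lc_name, new_lc_name, new_ami_id):
    """Clone an existing launch configuration with a new AMI and attach it to the ASG."""
    old = autoscaling.describe_launch_configurations(
        LaunchConfigurationNames=[old_lc_name]
    )["LaunchConfigurations"][0]

    autoscaling.create_launch_configuration(
        LaunchConfigurationName=new_lc_name,
        ImageId=new_ami_id,                   # the newly baked AMI
        InstanceType=old["InstanceType"],
        SecurityGroups=old["SecurityGroups"],
        KeyName=old["KeyName"],
    )

    # Point the Auto Scaling Group at the new launch configuration so that
    # any instances launched from now on use the new AMI.
    autoscaling.update_auto_scaling_group(
        AutoScalingGroupName=asg_name,
        LaunchConfigurationName=new_lc_name,
    )
```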
It turns out that for some reason AWS does not allow
Read MoreIt's insecure. The data is sent across the Internet via an insecure protocol, FTP. This could have been remedied by using a secure file transfer protocol such as SFTP.
It's inefficient. Every night, we're sending an entire snapshot to the backup server. Obviously, there's a whole lot of data on the server that never changes, so there is really no need to back it up every night. To make things worse, the backup process takes longer and longer each night, as the amount of data on the server increases over time. We're needlessly wasting precious bandwidth with this solution. The ideal solution involves doing a full backup as a baseline, then only sending the changes across in each subsequent backup.
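For illustration, here is a minimal sketch of that baseline-plus-changes idea, assuming S3 as the backup target (an assumption on my part) and using file size as a crude change check:

```python
import os

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
BUCKET = "example-backup-bucket"  # placeholder bucket name

def backup_directory(root):
    """Upload only files that are new or whose size differs from the stored copy."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            key = os.path.relpath(path, root)
            try:
                head = s3.head_object(Bucket=BUCKET, Key=key)
                if head["ContentLength"] == os.path.getsize(path):
                    continue  # same size as the stored copy; skip it
            except ClientError:
                pass  # no stored copy yet; upload it
            s3.upload_file(path, BUCKET, key)
```

A real implementation would want checksums or modification times as well, but the key point is that unchanged data never leaves the server.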
Eventually the FTP account was no longer available, so I started looking into a better solution. I'd done a lot of work recently with Amazon Web Services, so I decided to investigate