I just got back from Mountain West DevOps today, where this topic came up a lot (not specifically for Go, but deployment in general).
The short answer is: it depends.
For a simple app that doesn't get much traffic, you can just provision an instance manually, copy the binary onto it, run it, and be done. (You can cross-compile your Go binary if you're not developing on the same platform as production.)
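To make that concrete, here's roughly what those manual steps look like, wrapped in a small Python helper; the host, key file, binary name, and paths are just placeholders for your own setup, not anything specific:

    import os
    import subprocess

    HOST = "ubuntu@ec2-xx-xx-xx-xx.compute-1.amazonaws.com"  # hypothetical instance
    KEY = "deploy.pem"   # hypothetical SSH key
    APP = "myapp"        # hypothetical binary name

    # Cross-compile for the production platform (64-bit Linux here).
    env = dict(os.environ, GOOS="linux", GOARCH="amd64")
    subprocess.run(["go", "build", "-o", APP, "."], env=env, check=True)

    # Copy the binary over, then stop the old process and start the new one.
    # Uploading under a new name avoids overwriting a running executable.
    subprocess.run(["scp", "-i", KEY, APP, f"{HOST}:{APP}.new"], check=True)
    subprocess.run(["ssh", "-i", KEY, HOST,
                    f"pkill -x {APP} || true; mv {APP}.new {APP}; "
                    f"nohup ./{APP} >/dev/null 2>&1 &"],
                   check=True)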
For a bit more automation, you can write a Python script that copies the latest binary to the EC2 instance and runs it for you (using boto / ssh).
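As an illustration of the boto / ssh idea, here's a rough sketch using boto3 (boto's current incarnation). It assumes the binary was already built as in the previous snippet, and the tag name, region, and SSH user are assumptions for the example:

    import subprocess
    import boto3

    APP = "myapp"        # hypothetical binary name
    KEY = "deploy.pem"   # hypothetical SSH key

    # Find running instances tagged for this app.
    ec2 = boto3.client("ec2", region_name="us-east-1")
    resp = ec2.describe_instances(Filters=[
        {"Name": "tag:Role", "Values": [APP]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ])

    hosts = [
        inst["PublicDnsName"]
        for reservation in resp["Reservations"]
        for inst in reservation["Instances"]
        if inst.get("PublicDnsName")
    ]

    # Push the binary to each instance and restart it.
    for host in hosts:
        target = f"ubuntu@{host}"
        subprocess.run(["scp", "-i", KEY, APP, f"{target}:{APP}.new"], check=True)
        subprocess.run(["ssh", "-i", KEY, target,
                        f"pkill -x {APP} || true; mv {APP}.new {APP}; "
                        f"nohup ./{APP} >/dev/null 2>&1 &"],
                       check=True)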
Although Go programs are usually pretty stable (especially if you test them), you may want to daemonize the binary (run it as a service) so that it restarts automatically if it crashes for some reason.
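One way to do that (my assumption here, since any supervisor such as supervisord or upstart works just as well) is a systemd unit that restarts the binary on failure; the names and paths are placeholders:

    [Unit]
    Description=myapp (hypothetical Go service)
    After=network.target

    [Service]
    ExecStart=/opt/myapp/myapp
    Restart=on-failure
    RestartSec=5
    User=myapp

    [Install]
    WantedBy=multi-user.target

Drop that in as /etc/systemd/system/myapp.service, then enable and start it with systemctl enable myapp and systemctl start myapp.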
For more automation still, use a CI tool such as Jenkins or Travis. They can be configured to run your deployment scripts automatically when you push code to a specific branch or push tags.
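For instance, a Travis configuration along these lines runs a deploy script only when the build on a given branch passes; the branch name and the deploy.sh path are assumptions (deploy.sh could be the boto/ssh script above):

    language: go

    script:
      - go test ./...
      - go build -o myapp .

    deploy:
      provider: script
      script: ./deploy.sh
      skip_cleanup: true
      on:
        branch: master
        # or deploy only on tags instead:
        # tags: true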
For heavier-duty automation, you can take it a step further and use tools such as Packer or Chef. I would start with Packer if your needs aren't terribly intense; the Packer developer talked about it today, and it looks simple and powerful. Chef serves a lot of businesses well, but may be overkill for you.
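If you go the Packer route, a minimal template along these lines (legacy JSON format; the AMI ID, region, and names are placeholders I'm assuming for the example) bakes the binary into a machine image you can then launch directly:

    {
      "builders": [{
        "type": "amazon-ebs",
        "region": "us-east-1",
        "source_ami": "ami-xxxxxxxx",
        "instance_type": "t2.micro",
        "ssh_username": "ubuntu",
        "ami_name": "myapp-{{timestamp}}"
      }],
      "provisioners": [
        { "type": "file",  "source": "myapp", "destination": "/tmp/myapp" },
        { "type": "shell", "inline": [
            "sudo mv /tmp/myapp /usr/local/bin/myapp",
            "sudo chmod +x /usr/local/bin/myapp"
        ]}
      ]
    }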
In short: the main idea with Go programs is that you just copy the binary to the production server and run it. It's that simple. How you automate that, and how reliable you make it, is up to you, depending on your needs and your preferred workflow.
Further reading: http://www.reddit.com/r/golang/comments/1r0q79/pushing_and_building_code_to_production/ and in particular: https://medium.com/p/528af8ee1a58
Matt