Yes, Google will penalize duplicate content no matter where you put it: across different country domains, different subdomains, different subdirectories, whatever.
Building on Alex's example:
http://www.example.com/en-us/
http://www.example.com/en-au/
Most likely, en-us and en-au will be near-duplicates if the content is the same, since American English is barely different from Australian English.
Solution #1: Using the Robots Meta Tag
What you can do is set the robots meta tag in the <head> section of all secondary pages so that Google knows to skip them. The meta tag looks like this:
<meta name="robots" content="noindex,follow"/>
This tells Google not to index the content of the page, but to still follow any links on it.
Following Google's own example, you would set up a primary domain (in Google's case, google.com) that checks the visitor's location and redirects them accordingly. The primary domain should be indexable, while the secondary domains (the redirect targets) should not be indexable if they use the same language as the primary domain (and are therefore duplicates).
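As a minimal sketch of that setup (client-side only, using the browser's language preference as a rough stand-in for the visitor's location; production sites would normally geolocate by IP on the server), the primary page could look something like this:

<!-- Hypothetical landing page for the primary domain (www.example.com). -->
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Example</title>
  <script>
    // Send visitors with an Australian-English browser locale to the
    // en-au section; everyone else stays on the default en-us pages.
    if ((navigator.language || "").toLowerCase() === "en-au") {
      window.location.replace("http://www.example.com/en-au/");
    }
  </script>
</head>
<body>
  <p>Default en-us content goes here.</p>
</body>
</html>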
Solution #2: Specifying the Canonical URL
Another option is to use the rel="canonical" link element to hint to Google that the same page is actually accessible from multiple URLs, and which URL you prefer.
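For the example above (a minimal sketch using the hypothetical URLs from earlier), the en-au page would declare the en-us version as its preferred URL with a single link element:

<!-- Placed in the <head> of http://www.example.com/en-au/ -->
<!-- to tell Google the en-us URL is the preferred version. -->
<link rel="canonical" href="http://www.example.com/en-us/"/>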
As Google's announcement put it: "Carpe diem on any duplicate content worries: we now support a format that allows you to publicly specify your preferred version of a URL. If your site has identical or vastly similar content that's accessible through multiple URLs, this format provides you with more control over the URL returned in search results. It also helps to make sure that properties such as link popularity are consolidated to your preferred version."
You can read more about it on the Google Webmaster Central official blog. :)