Design for scalability / performance / security, etc.

Most design articles and books focus on achieving loosely coupled, highly cohesive software.

The general guidance is not to design for performance up front, but simply to write efficient code; performance can be fixed later in the hot spots. I'm speaking in the context of rich domain applications.

What about scalability and security? When do you start considering these criteria? And I'm not just talking about SQL injection.

For example: what if I suddenly have to serve millions of users rather than, say, several hundred ... When should I expose my internal website to the Internet? And so on.

+4
7 answers

Security should always be considered from the very beginning. Bolting security on after the fact is a surefire way to ensure your application never has the security it needs. Scalability is something that can be added later without unduly disrupting the code base, provided you at least thought about scaling while you were coding (no implementation needed up front).

+2

Security, I think, is something that needs to be taken into account from the very beginning, since it can be much harder to retrofit security-related features later.

I believe that to some extent scalability is similar to performance: tuning can be done later if you need it, while you focus on writing efficient code in general. But this is not a hard and fast rule. I would say that, depending on your choice of platform, some scalability concerns should be considered from the very beginning, such as the use of Session state in ASP.NET ...
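
Here is a minimal sketch of that concern translated into Python terms (not ASP.NET, which the answer actually names): keeping session state inside the web process pins a user to one server, while a shared store lets any server handle any request, so you can add web servers behind a load balancer later. The Redis host name, port, and TTL are placeholder assumptions.

    import json
    import redis  # assumes the redis-py package and a reachable Redis server

    class InProcessSessions:
        """Fast, but lives in one web server's memory -- breaks once you scale out."""
        def __init__(self):
            self._data = {}

        def save(self, session_id, session):
            self._data[session_id] = session

        def load(self, session_id):
            return self._data.get(session_id, {})

    class SharedSessions:
        """Kept outside the web process, so every server sees the same state."""
        def __init__(self, host="session-store", port=6379, ttl_seconds=1800):
            self._redis = redis.Redis(host=host, port=port)
            self._ttl = ttl_seconds

        def save(self, session_id, session):
            self._redis.setex(session_id, self._ttl, json.dumps(session))

        def load(self, session_id):
            raw = self._redis.get(session_id)
            return json.loads(raw) if raw else {}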

+1

Security is a must for all but trivial internal projects. As for scalability / performance / availability, it really depends on the service you are trying to provide. For most sites (almost all sites except true (non-meta) web-scale search engines), performance and scalability are usually not a huge problem and can be handled simply by putting a few load balancers in front of the web app servers, adding caching (for example, with memcached), and doing ad hoc data partitioning (partition tables by user ID, etc.). The real headache is usually the requirement for high data availability, since most DBMS replication and failover solutions do not work very well.
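
To make the caching point concrete, here is a minimal cache-aside sketch in Python (a language assumption; the answer names none). It uses the pymemcache client against an assumed memcached instance on localhost, and query_db() is a placeholder for the real database lookup.

    import json
    from pymemcache.client.base import Client

    cache = Client(("localhost", 11211))  # assumed local memcached instance
    CACHE_TTL = 300                       # seconds to keep an entry

    def query_db(user_id):
        """Placeholder for the real (expensive) database query."""
        return {"id": user_id, "name": f"user-{user_id}"}

    def get_user(user_id):
        key = f"user:{user_id}"
        cached = cache.get(key)
        if cached is not None:
            return json.loads(cached)                       # cache hit: skip the database
        user = query_db(user_id)                            # cache miss: query once...
        cache.set(key, json.dumps(user), expire=CACHE_TTL)  # ...then fill the cache
        return user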

If you want to build a search engine for the entire web, you will need to think about performance and scalability from day one, or you will end up spending a lot of time and resources.

+1
  • Security must be addressed from the start; optimizing performance later is much easier than retrofitting a fix for a security hole.
  • Scalability can be taken into account during the design process when choosing servers, databases, and software.
  • Efficient code generally performs well, so up-front performance design is less important.
0

Scalability is not something that can be bolted on after an entire application has been developed. If you need to handle millions of hits, you need to think carefully before writing a single line of code.

There are two concepts here: vertical scaling and horizontal scaling. Vertical scaling means throwing more memory and more processing power at the problem, but eventually that runs out. You should always plan for horizontal scaling.

Design it so that you can split your business workload into high-usage, medium-usage, and low-usage tiers. Then you can run the high-usage workflows on separate application / web servers. Your high-usage workflow can hit multiple databases. Can you partition the data and spread the load across different boxes? If so, distribute it at the database level as well (a minimal sharding sketch follows the link below). The following is a wonderful description of the eBay architecture, describing how they managed to scale horizontally:

http://www.addsimplicity.com/downloads/eBaySDForum2006-11-29.pdf
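
And here is a minimal sketch of the "distribute it at the database level" idea, in Python with placeholder connection strings: route each user to one database box by user ID. Real systems usually prefer consistent hashing or a lookup service so that shards can be added without remapping every user.

    SHARD_DSNS = [
        "postgresql://db-box-0/app",   # placeholder connection strings
        "postgresql://db-box-1/app",
        "postgresql://db-box-2/app",
    ]

    def shard_for(user_id: int) -> str:
        """Pick the database box that owns this user's rows."""
        return SHARD_DSNS[user_id % len(SHARD_DSNS)]

    # A given user always lands on the same shard, so all of that
    # user's reads and writes stay on one box.
    print(shard_for(12345))  # -> postgresql://db-box-0/app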

0

Performance and scalability are never a problem ... until they are. Your general approach of writing good quality code and then working on the hot spots is a good plan, as long as you keep testing and measuring performance during development. On large projects there are always features, features, and more features, and without regular performance measurements you will not know whether a problem was introduced by a particular feature. Going back to find the problem three years later is a much harder task. So I would say don't obsess over performance up front, but start measuring it as part of your daily / weekly / iteration builds. It will give you excellent information and confidence in your code.
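
A minimal sketch of "measure it as part of your regular builds", assuming Python: time a hot path on every build and fail loudly when it blows past a budget. The 50 ms budget and the hot_path() body are placeholders; in practice you would also record the numbers over time rather than rely on a single threshold.

    import time

    BUDGET_SECONDS = 0.050  # assumed per-call budget for this hot spot

    def hot_path():
        """Placeholder for the code path whose performance you care about."""
        return sum(i * i for i in range(10_000))

    def measure(func, repeats=100):
        start = time.perf_counter()
        for _ in range(repeats):
            func()
        return (time.perf_counter() - start) / repeats

    if __name__ == "__main__":
        per_call = measure(hot_path)
        print(f"hot_path: {per_call * 1000:.2f} ms per call")
        if per_call > BUDGET_SECONDS:
            raise SystemExit("performance budget exceeded -- investigate this build")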

0

Security from the start, since it is hard to retrofit later.

As for performance and scalability, there is an interview with the creators of Stack Overflow on Hanselminutes: http://www.hanselminutes.com/default.aspx?showID=152

I like the pragmatic approach they took with this site, even if it sometimes makes me cringe. For example, they have the database and the web server on the same machine. Yes, and the development server too. But it works. And setting up a separate DB server is quite simple if there is ever too much traffic for one machine.
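
One reason that later split stays simple is if the database location is configuration rather than an assumption baked into the code. A minimal Python sketch, with placeholder environment variable names and DSN format:

    import os

    # Today the database lives on the same machine as the web server.
    # Tomorrow, set DB_HOST=db01.internal (or similar) and redeploy --
    # nothing else in the code needs to know that the database moved.
    DB_HOST = os.environ.get("DB_HOST", "localhost")
    DB_NAME = os.environ.get("DB_NAME", "app")

    def connection_string():
        return f"postgresql://{DB_HOST}/{DB_NAME}"

    print(connection_string())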

0
