Why make before make install

I know the usual installation process when building from source code:

  • ./configure
  • make
  • make install

but why run "make" before "make install"? Why not just run "make install"?

My understanding so far is that "make" only compiles the source code into executables, and "make install" then copies them into a directory on the PATH. Am I correct?

Perhaps I did not make my question clear; sorry about that.

Let's say we want to install an executable on a machine: can we just do

  • ./configure
  • make install

instead of the three steps shown above?

+7
4 answers

When you run make, you are essentially asking it to follow a set of build steps toward a specific target. When make is called without arguments, it builds the first target in the Makefile, which usually just compiles the project. make install builds the install target, which usually does little more than copy the binaries to their destination.
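
As a minimal sketch of what that looks like in a hand-written Makefile (the program name "myprog" is hypothetical, not from any particular project, and real recipe lines must be indented with a tab character):

  # Where "make install" should put things; override with: make PREFIX=... install
  PREFIX = /usr/local

  # The first target in the file is what plain "make" builds.
  all: myprog

  # Compile the program from its source file.
  myprog: myprog.c
          cc -o myprog myprog.c

  # "make install" depends on "all", then just copies the binary into place.
  install: all
          install -m 0755 myprog $(PREFIX)/bin/

  .PHONY: all install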

Often, the install target depends on the compilation target, so you may be able to get the same result just by running make install. However, I can see at least one good reason to do them as separate steps: privilege separation.

Usually, when you install your software, it goes into places where ordinary users do not have write access (for example, /usr/bin or /usr/local/bin). So you often have to run make and then sudo make install, because the install phase requires elevated privileges. This is a Good Thing™, because it lets you compile the software as a regular user (which actually matters for some projects), limiting the potential damage a misbehaving build could do, and only gaining root privileges for the install phase.
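
In practice, that workflow looks something like this (assuming a conventional autoconf-style project that installs under /usr/local):

  ./configure           # set up the build, as a regular user
  make                  # compile everything, still as a regular user
  sudo make install     # only the copy into /usr/local needs root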

+10

make without parameters reads ./Makefile (or ./makefile) and builds the first target. By convention, that may be the all target, but not necessarily. make install builds a special target, install. By convention, this takes the results of make all and installs them on the current machine.

Not everyone needs make install. For example, if you are building a web application for deployment on another server, or using a cross-compiler (for example, building an Android application on a Linux machine), it makes no sense to run make install.
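
A related trick, in case you want the installed file layout but not on the build machine (for example, to package it up or ship it to another server): many autotools-style Makefiles honour the DESTDIR convention, so you can stage the install into a scratch directory instead of skipping it entirely. A sketch, assuming such a Makefile:

  ./configure --prefix=/usr
  make
  make DESTDIR=/tmp/staging install
  # files end up under /tmp/staging/usr/... instead of /usr/...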

In most cases, a single line ./configure && make all install will be equivalent to the three-step process you described, but that depends on the product and on your specific needs, and again it is only a convention.
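
Spelled out in the shell, the usual equivalents are below; && only runs the next command if the previous one succeeded, and "make all install" is a single make invocation given two goals:

  ./configure && make && make install    # three commands, chained on success
  ./configure && make all install        # one make run that builds both targets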

+3

There are times when I want to try compiling my code changes but not deploy them. For example, if I am hacking on the Asterisk C code base and want to make sure my changes still compile, I save and run make. However, I do not want to deploy those changes because I have not finished coding.

For me, running make is just a way to make sure my code has not accumulated so many compile errors that they become hard to track down. Perhaps more experienced C programmers do not have this problem, but for me, limiting the number of changes between compiles reduces the number of possible changes that could have broken my build, and that makes debugging easier.

Finally, it also gives me a stopping point. If I want to go to dinner, I know that someone can restart the application in its current working state without having to track me down, since the binaries are only copied into the actual application folder when I run make install.

There may well be other reasons, but this is why I appreciate that the two commands are separate. As others have said, if you want them combined, you can combine them in your shell.

+1

A lot of software these days will do the right thing with just make install. In packages that do not, the install target does not depend on the compiled binaries. So, to play it safe, most people use make && make install or some variant of it.

0
