Why didn't logic programming win?

Looking at what it offers, I see several big advantages:

  • Possibly the best approach to error-free programming. Example: to enable or disable a menu item with imperative programming, you must not only remember what state the item should be enabled in, but also remember to run that piece of code at every moment the state can change. In logic programming, the latter (I suppose) is not needed.
  • A potentially great way to write programs that start faster. Since everything is expressed as dependencies, code actually runs only when it is needed. Typically, many pieces of code take a lot of time at startup not because they are needed right away, but because they will be needed at some point in the future.
  • It also seems like a great way to apply concurrency automatically: if we can track all the dependencies between elements, we can in theory see that some branches of the dependency graph can be evaluated in parallel.

These are just my assumptions, since I have never actually written a program in a logic programming language, but it seems like a very impressive concept. So are there any flaws, or are the advantages I listed not really true in practice?

thanks

Max

+4

4 answers

Speaking of Prolog, with which I have some experience, I found that logic programming suits certain tasks well but becomes very hard to debug once an application grows beyond a certain size. The paradigm does not fit certain problems, or certain scales of problem.

Prolog is not a general-purpose programming language; it was designed for AI. It has a single-minded goal, and it largely ignores everything else.

Modern mainstream languages are not built around a single purpose. They are general, and equally applicable to most programs, down to a typical business application. That is a huge advantage: C# knowledge transfers to many areas; Prolog knowledge simply doesn't. Writing certain kinds of applications (say, real-time graphics) would be terribly painful in a logic programming language. The mere thought gives me a headache (seriously).

I don't think logic programming was ever really a competitor. It has always been used in specialized contexts rather than in general; it isn't fighting for popularity.

I don't know whether F# brings anything new to the party. It seems quite popular, though I'm not sure you would call it strictly logic programming; it looks more like a hybrid.

+3

Logical inference is hard to understand and hard to implement efficiently, both in runtime and in memory. Many otherwise simple things (for example, side effects) are difficult to express in a logic language, either because of its "pseudo-non-deterministic" execution model (for example, the backtracking built into Prolog) or because it relies on unification instead of a simpler functional evaluation model.

Logic programming is great for specific applications, but terrible for the 90% of everyday programming that mostly consists of moving data around and updating state.

+3

AFAIK, Prolog and related logic programming languages never died. They are used quite often for some problems, and more often than you might think as a domain-specific language (for example, to solve one specific problem in an application mostly written in some other language).

As you noted in the question, logic programming languages are not very suitable for many stateful problems. But, equally, imperative languages are poorly suited to many problems that are not about state.

For me, the question is a bit like asking why yacc did not win. It (and its relatives) really did win, or at least earned a good place, but only in one particular sport: parsing. There are other sports with other winners.

EDIT: Perhaps a better comparison is SQL. You would not expect it to replace C, but there are many C programs that use SQL to handle database queries. A Prolog program is basically a database with a more sophisticated query system: Turing-complete, but not intended for use as a general-purpose language.
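As a hypothetical illustration of the "database with a smarter query system" analogy (the facts and rule here are the classic textbook example, not from the answer), here is how Prolog-style facts plus a recursive rule might be sketched in Python:

```python
# Hypothetical sketch: a Prolog-like fact base with one recursive rule.
# Facts: parent(tom, bob). parent(bob, ann).
parent = {("tom", "bob"), ("bob", "ann")}

def ancestor(x, y):
    # ancestor(X, Y) :- parent(X, Y).
    # ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y).
    if (x, y) in parent:
        return True
    return any(p == x and ancestor(c, y) for p, c in parent)

print(ancestor("tom", "ann"))  # True
print(ancestor("ann", "tom"))  # False
```

The data sits in a fact base and the "query" is answered by the rule on demand, much as a C program would hand a query off to SQL.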

+2

Well, maybe because it doesn't describe what the hardware will actually do?

Almost all well-known languages descend from C, and those languages are imperative because C is imperative. C was imperative because it was originally conceived as a kind of high-level assembler (I'm obviously being too fuzzy here, but you get the idea): essentially a list of instructions, in a friendlier dialect, for the processing unit, i.e. the hardware.

It is easier to think linearly, the way the processor does its computations (not quite true today: there is a lot of internal parallelization to optimize processing, but the general idea is that the processor applies your instructions in the order you give them). Multi-threaded programming still seems like a lot of trouble to many developers used to single-threaded execution, so I assume the same holds here.

However, that is just a guess; I'm no specialist at all.

+2
