Makefile is “make file”. It has been abused into being a task runner, which is the wrong tool for the job.
The “make file” is all about file dependencies based on last-modified dates. Outdated target files can be rebuilt from their source files. It is dependency management and the essence of an incremental compiler, but all revolving around files, not tasks.
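A minimal sketch of that model: a target is rebuilt only when it is older than one of its prerequisites (the file names here are just illustrative).

```makefile
# Running `make app` compares mtimes: if app is newer than both
# main.c and util.h, make does nothing; otherwise it re-runs the
# recipe below.
app: main.c util.h
	cc -Wall -o app main.c
```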
dima55 3 hours ago [-]
Right. It kinda sucks for that purpose too, which gives Make a bad name.
kristopolous 3 hours ago [-]
What gives make a bad name is the same thing that gave javascript or m4 a bad name: these things are their own exotic birds, and doing them well requires new concepts and new behaviors.
You can indeed shoehorn them into what you know but really you need to fully embrace their weird world.
See also forth, dc, awk, jq ...
It'd be nice to have a dedicated crash course on these things for people who understand conventional programming and have been doing the normal stuff for a number of years.
Also see supercollider, prolog, haskell, apl...
I think the most mainstream exotic bird people learn is Lisp. Doing all these things well is as different as Lisp is from say conventional python.
anthk 3 hours ago [-]
Forth is far easier than dc.
Lisp, exotic? It's damn easy. Haskell is far worse.
jonhohle 2 hours ago [-]
It may suck for it, but it’s better than a collection of random scripts and commands baked into CI configuration that evolve to become unrunnable in normal dev environments.
tgv 4 hours ago [-]
The nice thing about make is that it is ubiquitous, and that it offers nice things out of the box.
paulddraper 2 hours ago [-]
It works fine for PHONY targets.
But most people don’t realize in many cases they can do better than that.
kiitos 4 hours ago [-]
Consolidation of .PHONY targets is an anti-feature, the .PHONY decl is supposed to be adjacent to its target...
pjmlp 12 hours ago [-]
I am quite certain I used this kind of tooling during the 1990s, with ads in The C/C++ Users Journal and Dr. Dobb's developer magazines.
JdeBP 7 hours ago [-]
It wouldn't have been Matthias Andrée's makel, then.
* https://git.maandree.se/makel
Or unmake.
* https://crates.io/crates/unmake
Or checkmake.
* https://github.com/checkmake/checkmake (https://news.ycombinator.com/item?id=32460375)
Or make-audit.
* https://github.com/david-a-wheeler/make-audit
Or the Sublime linter for makefiles.
* https://github.com/giampaolo/SublimeLinter-contrib-makefile
It hasn't taken quite the 50 years that we are told, has it? (-:
rednafi 6 hours ago [-]
Thanks for the tool. This is pretty neat.
It’s almost comical to see “why Python” comments after all these years. I would’ve chosen Go to write this, but that’s beside the point.
Yes, Python installation is tricky, dependency management is a mess (even with uv, since it’s not standard tooling; another one will pop up), and performance is atrocious. But even then, some Python libraries have bigger communities than the combined communities of all these “better, faster, more awesome” languages.
Python is here to stay. Newcomers love it because you need to know so little to get started. People don’t care about the little quirks when they begin, and eventually they just live with the warts. That’s fine. LLMs write better Python than Go (my preferred language, or whatever yours is). And if you know anything about the AI research community, it’s C++, C, Python or GTFO.
Going forward, a lot more tools will be written in Python, mostly by people entering the field. On top of that, there’s a huge number of active Python veterans churning out code faster than ever. The network effect keeps on giving.
So whatever language you have in mind, it’s going to be niche compared to Python or JS. I don’t like it either. But if languages and tools were chosen on merit instead of tribalism, we wouldn’t be in this JS clusterfuck on the web.
callc 5 hours ago [-]
I love python, have used it for years. I hate the dependency and multiple interpreter situation.
A great PL should stand on its own without the need for external tooling.
At this point I have given up on Python except when it’s a little script that only uses standard libraries. Otherwise I’m choosing a compiled language.
Some more thoughts: http://calvinlc.com/p/2025/06/10/thank-you-and-goodbye-pytho...
I use python without any dependencies on web servers. Pip is cool, but you don't need to get pulled into the node-like dependency hell.
For example, instead of requests you can use http.client; instead of flask, http.server, or socketserver.TCPServer, or just socket. If you want sqlite, don't jump to pip install sqlite or whatever, use sockets to talk to it.
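For instance, a stdlib-only HTTP round trip (no requests, no flask) can be sketched like this; the handler and response body are made up for the demo:

```python
import http.client
import http.server
import threading

class Hello(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Bind to port 0 so the OS picks a free port.
server = http.server.HTTPServer(("127.0.0.1", 0), Hello)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
conn.request("GET", "/")
resp = conn.getresponse()
status, text = resp.status, resp.read().decode()
server.shutdown()
print(status, text)  # 200 hello
```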
jonotime 1 hour ago [-]
How do you use only sockets to talk to sqlite?
blks 4 hours ago [-]
It’s harder to distribute software written in Python via e.g. a package manager, compared to compiled languages.
kiitos 5 hours ago [-]
> It’s almost comical to see “why Python” comments ... Yes, Python installation is tricky, dependency management is a mess (even with uv, since it’s not standard tooling; another one will pop up), and performance is atrocious. But ... Newcomers love it because you need to know so little to get started. People don’t care about the little quirks when they begin, and eventually they just live with the warts.
I'm not sure if this is news to you or if you already know it, but, just to be explicit -- you know that the overwhelming majority of end users aren't gonna have `pip` installed on their systems, right? And that any project with "Installation instructions" that begin with a `pip` command aren't really gonna work in the general case?
Just wanna make sure that's well-understood... it's fine if you wanna build a tool in Python, but if you expect it to be practically usable, you need to do distribution of binaries, not `pip` targets...
rednafi 5 hours ago [-]
This point has been pummeled to death for decades. Before Python, people did the same with Ruby and “gem.” Literally nothing is new here.
One of the reasons I write my tools in Go is exactly this. But if the tool was written in Go, people would complain about why not Rust and such. The point wasn’t to convey that Python doesn’t have its fair share of flaws, but to underscore that the HN crowd doesn’t represent any significant majority. The outside world keeps on using Python, and the number of Go or Rust users is most likely less than PyTorch or Scikit-learn users.
Shipping Python is hard and the language is slow. Also, tooling is bad. The newfangled ones are just a few in the long stream of ad hoc tooling over the past 20 years. Yet people write Python and will continue to do so. JS has a similar story, but it’s just a 10x worse language than Python.
kiitos 5 hours ago [-]
Let me be even more explicit: if your installation instructions are `pip install ...` -- or `npm install ...` for that matter -- then you are automatically excluding a super-majority of potential users.
rednafi 4 hours ago [-]
I don’t even write python these days. I just wrote my own version of a terminal llm-caller[^1] in Go for this exact same reason.
There’s a famous one that does the same thing but is written in Python. So it has its issues.
My point is, pip exists on most machines. pip install sucks but it’s not the end of the world. The HN crowd (including myself) has a tendency to fixate on things that the majority don’t care about IRL.
[1]: https://github.com/rednafi/q
Some Makefiles use indents or variable placement as semantic cues. If a tool rewrites them mechanically, it might clean things up while killing meaning. Is structural correctness enough, or do we need formatters that preserve human context too?
iib 8 hours ago [-]
Ideally, we'd have linters that preserve the human context as well. But human context may be too ambiguous and high-variance to be practical.
It's hard to say what's intentional and what's not; maybe linters with many custom rules would work best.
jchw 8 hours ago [-]
That doesn't sound any different than it is for any other programming language, but many people prefer automatic formatting anyways.
s4i 12 hours ago [-]
Does this support inline ignoring specific rules with some syntax? Couldn’t find this from the README. Would be good to have as an escape hatch.
teo_zero 2 hours ago [-]
If only it accepted POSIX syntax...
foma-roje 13 hours ago [-]
It really doesn’t have to be complicated for it to be useful. Plenty thanks for sharing this.
seanwilson 5 hours ago [-]
Anyone else not bother maintaining a list of .PHONY targets? Always felt like a chore that adds noise for a rare edge case.
leetrout 16 minutes ago [-]
> The implicit rule search (see Using Implicit Rules) is skipped for .PHONY targets. This is why declaring a target as .PHONY is good for performance, even if you are not worried about the actual file existing.
https://www.gnu.org/software/make/manual/html_node/Phony-Tar...
How does this differ from checkmake, the older Makefile linter and formatter?
1oooqooq 11 hours ago [-]
ewwww. consolidated phony lines. everyone knows these should be right before each rule declaration.
leetrout 7 hours ago [-]
From the README it seems this can be disabled:
group_phony_declarations = false
I think the visual clutter of .PHONY on each recipe declaration is better since there’s always a lot of copy-paste coding.
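For illustration, the two styles being debated (target names hypothetical; either style works, since .PHONY declarations accumulate):

```makefile
# Consolidated style: one declaration collects every phony target.
.PHONY: build test clean

# Adjacent style: each declaration sits right above its rule, so a
# copy-pasted rule carries its .PHONY along with it.
.PHONY: test
test:
	go test ./...
```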
Crier1002 13 hours ago [-]
ive always wanted this. im going to give it a go!
does this happen to support IDEs like vscode?
antonhag 11 hours ago [-]
From the readme:
VSCode Extension
1. Open VSCode
2. Go to Extensions (Ctrl+Shift+X)
3. Search for "mbake Makefile Formatter"
4. Click Install
Crier1002 10 hours ago [-]
thanks! apologies i was on mobile and missed this. im excited to try it out
eabeezxjc 7 hours ago [-]
simple use rake
notnmeyer 7 hours ago [-]
or any other task runner
waqar144 12 hours ago [-]
Uh, why Python?
legends2k 11 hours ago [-]
Why not Python? I primarily program in C++ but I see it as a decent choice as Python is available in almost all recent machines. Of course Windows is a notable exception but given it's a tool for developers I guess Python should be present.
IshKebab 11 hours ago [-]
1. Terrible performance.
2. Terrible installation UX.
The number of issues we've had with pre-commit because it's written in Python and Python tooling breaks constantly...
In fairness, the latter point may be finally solved by using `uv` and `uv tool install`. Performance is still a major issue though. Yamllint is easily the slowest linter we use.
(I'm tempted to try rewriting it in Rust with AI.)
mcswell 4 hours ago [-]
> 1. Terrible performance
Performance only matters if you're doing something compute- or disk-intensive, and then only if the libraries you're using are Python all the way down. AI programming (at least the kind that most of us do; I don't know about places like OpenAI) is generally done with Python using libraries that use some compiled language under the hood.
And in this case--a linter--performance is almost certainly never an issue.
IshKebab 1 hours ago [-]
The only thing computers do is compute and disk.
Performance only matters if you care about performance, and I do care about performance. If you don't, fine I guess.
turtlebits 5 hours ago [-]
Then remove it? There's always tradeoffs adding tooling - I'm assuming you have it in your workflow to catch downstream issues because it saves more time in the long run.
viraptor 10 hours ago [-]
It definitely is a problem when the tool you're going to use a few times a week takes an extra hundred milliseconds compared to a native solution. Especially when you need to process huge data files like hand crafted makefiles. I can totally feel your pain - extra effort would've been made to avoid that at the cost of development speed. /s
justinrubek 8 hours ago [-]
I find that writing anything substantially complex in python sacrifices the development speed. That isn't its strong suit. It's that a lot of people want to write their code in it by preference.
IshKebab 7 hours ago [-]
Yeah if only it was an extra 100 milliseconds a few times a week. We have yamllint (also written in Python) in our pre-commit (also written in Python) and it definitely adds a second or two.
Also format-on-save is a common workflow.
smusamashah 9 hours ago [-]
Terrible portability across platforms, especially with dependencies.
kiitos 5 hours ago [-]
`pip install ...` is not a reliable or appropriate mechanism for distribution of any kind of tool like this one. Table stakes is pre-compiled architecture-specific binaries.
exe34 12 hours ago [-]
Presumably because the author is comfortable with python and it is easy to do string manipulation with.
johnisgood 10 hours ago [-]
Perl is much faster than Python, and it is especially good for string manipulation. Thus, I would have chosen Perl.
a2128 9 hours ago [-]
I would've chosen Java because it's faster than Python and is good for string manipulation. My cousin would've chosen Brainfuck because he's really good at it. Alas, this discussion is useless because none of us are the one who spent the effort to write the Makefile formatter and linter, we can only bikeshed all day about what decisions we would've taken
johnisgood 8 hours ago [-]
For the 100th time, I was responding to "[...] python and it is easy to do string manipulation with.". Perl is way better to do string processing in than Java, too, FWIW.
My comment has absolutely nothing to do with this project or its author, nor the language he has chosen. See the other comments in this thread.
WJW 10 hours ago [-]
Almost everybody knows python these days, almost nobody knows Perl. It's not weird that OP chose a language they already knew.
johnisgood 10 hours ago [-]
I am not saying it is weird; I was just responding to the parent with regard to "string manipulation", and someone mentioned "performance", so I stated two facts about Perl.
I do not care whether or not this project is written in Python. Sure, he chose Python because he is more familiar with it. That is fair enough to me.
elteto 8 hours ago [-]
Perl is absolutely not faster than Python, not even for regexes.
johnisgood 6 hours ago [-]
It really is though. When I was on the journey to find the fastest interpreted scripting languages, Perl and LuaJIT were among the fastest, meaning Python is slower than both of these languages.
rurban 7 hours ago [-]
Perl is always faster than python. Python is just absurdly bad in performance
exe34 10 hours ago [-]
Would you be willing to share the link to your repo?
johnisgood 10 hours ago [-]
I have not written a Makefile formatter and linter in any language.
exe34 10 hours ago [-]
Yes, I thought so. It's much easier to criticize than to create.
johnisgood 10 hours ago [-]
It does not invalidate anything I have said, and this reasoning of yours is so flawed.
For example: I thought this or that music or movie sucked. I do not need to know how to make a song or a movie to be able to criticize it, let alone have made one in a similar vein; same with books. I can criticize a book, or an article, without having written one myself on related topics.
All that said, where did I criticize? I did not criticize anything, at all.
I stated facts. Perl is indeed faster than Python, and Perl was indeed made with string manipulation in mind. I made no comment about this project or its author, thus, it was not a criticism of any sort.
exe34 9 hours ago [-]
Do you regularly offer your directorial opinions to movie makers and singers? Or is it just programmers?
johnisgood 8 hours ago [-]
I do not care about movie makers and singers in general, and for the most part I do not have direct contact with them, so it would be futile to offer any advice. I did offer advice to a couple of singers before though. What is your point besides being unnecessarily defensive over two simple, stated facts? As I said, it was not a criticism of the author or the project, it was a response to your comment. Since this is going to lead nowhere, I am going to stop responding to this thread.
pjmlp 6 hours ago [-]
I will remember that the next time someone voices an opinion at the sports bar.
johnisgood 2 hours ago [-]
Remember, you cannot criticize (even though it was not that) unless you have something to show for it! Next time someone provides a critique of an article, we have to make sure to let them know it is wrong to criticize unless they have written an article themselves on the same topic.
FWIW it really was just about his comment, and I made two statements: Perl is faster than Python, and that Perl is especially good for string manipulation. I do not mind that he chose Python, good for him.
exe34 28 minutes ago [-]
For somebody who does not mind that the author chose python, you defended your opinions robustly.
x3n0ph3n3 13 hours ago [-]
I don't understand why people like make so much. Interacting with CLI tools using only env vars as arguments is cartoonishly bad dev experience.
zelphirkalt 12 hours ago [-]
Make allows you to specify dependencies for your targets, which are also targets. As such you do not need to rely on brittle string concatenation approaches. It is a tool built for this.
I personally like going to a project folder and run "make run", no matter what language or setup I have, to run the project. It enables me to unify access to projects.
I also take great care to make these runs reproducible, using lock files and other things of the ecosystems I am using, whenever possible. I work on another machine? git clone, make run. Or perhaps git clone, make init, make run.
georgyo 13 hours ago [-]
I'm not so sure most people would agree with you. Though I think plenty would.
I dare say that developers like environment variables more than before. Consider that Docker images, and hence Helm charts, are entirely controlled via environment variables. These very popular dev tools suffer from the same problem of having near-zero easy discoverability of what those environment variables might be. Yet they are very popular.
But I don't think Make usually uses all that many environment variables. You're usually specifying build targets as the command line arguments. Automake and autogen usually generate these makefiles with everything hard-coded.
Also, it makes it very easy to get started with, and it is universally available. Makes it very easy to like.
danlitt 11 hours ago [-]
Make is in POSIX, so it's generally available. Same reason people write shell scripts (even if the scripts are not generally POSIX-only).
bshacklett 8 hours ago [-]
Unless your company forces you to use Windows, which is still much more common than many would like to admit. And yes, WSL exists, but in my experience, if a company is unwilling to allow macOS, there’s a good chance they either don’t allow enabling HyperV, or the security software they use is such garbage that it results in a HyperV enabled system being effectively unusable.
pjmlp 6 hours ago [-]
Windows 11 requires Hyper-V turned on, virtualization based security is one of the reasons of the forced hardware upgrades.
cerved 11 hours ago [-]
I like it because it's language and tooling agnostic, declarative, fast and ubiquitous.
Where it's less great is complicated recipes and debugging
xigoi 8 hours ago [-]
Make is not language agnostic; it has implicit rules for compiling C.
pheggs 8 hours ago [-]
it also has implicit rules for other languages; why would that make it non-agnostic?
I mean there's really not much difference between "VAR=val make x" and "make x VAR=val" now is there?
marginalia_nu 13 hours ago [-]
Syntactically? No. Semantically? Yes.
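A sketch of one semantic difference in GNU make, for reference (variable and target names made up):

```makefile
# `make x VAR=cli` prints "cli": a command-line assignment
# overrides the one in the makefile.
# `VAR=env make x` prints "default": a makefile assignment
# overrides the environment (unless make is run with -e).
VAR = default

x:
	@echo $(VAR)
```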
lmz 10 hours ago [-]
I'm guessing the syntax was the part the poster was complaining about when they complained about the "dev experience".
marginalia_nu 8 hours ago [-]
Dunno, there are other aspects of environment variables that deteriorate the dev experience. They're very conducive to spooky action at a distance, since they're silently being passed along from parent-process to child-process (except when they aren't).
They can cause a lot of annoying bugs, and sometimes it's hard to track down where they are coming from (especially when dealing with stuff running in containers).
77pt77 4 hours ago [-]
> except when they aren't
Like sudo for example.
So many problems related to that.
motorest 10 hours ago [-]
> Interacting with CLI tools using only env vars as arguments is cartoonishly bad dev experience.
Make excels at what it's designed to do: specify a configurable DAG of tasks that generate artifacts, execute them, and automatically determine which subgraph requires updates and which can be skipped by reusing their artifacts.
I wonder: which tool do you believe does this better than Make?
Tup. But the Internet’s make mind-share means you still have to know make.
Edit: and make lets you use make to essentially run scripts/utils. People love to abuse make for that. Can’t do that with tup.
motorest 7 hours ago [-]
> Tup.
I don't think Tup managed to present any case. Glancing at the page, the only conceivable synthetic scenarios where they can present Tup in a positive light are build times of > 10k files, and only when recompiling partially built projects. And what's the upside of those synthetic scenarios? Shaving a couple of seconds off rebuilds? That's hardly a compelling scenario.
aDyslecticCrow 9 hours ago [-]
Abuse? Running linters, code analysers, configuration tools, template engines, spellcheckers, pulling dependencies, building dependencies with different build systems.
Sufficiently complex projects need to involve a lot of weird extra scripts, and if a build system cannot fulfil that... then it needs to be wrapped in a complex bash script anyway.
kiitos 5 hours ago [-]
> Tup
`tup` relies on a stateful database, which makes it incomparable to `make`.
PhilipRoman 11 hours ago [-]
You don't have to write Make invocations by hand... It's just a tool that can be called from any editor or IDE (or by automatic file watchers). Environment variables aren't really relevant to Make either, unless you really want to misuse it as a command runner.
aboardRat4 11 hours ago [-]
Because make is Prolog in disguise.
hiAndrewQuinn 13 hours ago [-]
I often prefer to work in in extremis environments where there is no internet access, and hence no easy way to get ahold of make; it's given me a bad habit of just writing a build.bash script to do what make does most of the time. I haven't really found myself missing it that much.
0points 13 hours ago [-]
If you can install bash on your airgapped dev box, why wouldn't you install make on it? Make is part of the core dev environment on just about every distro under the sun.
77pt77 4 hours ago [-]
most minimal setups nowadays have bash, make and even perl.
M0r13n 13 hours ago [-]
I am confused, because this means that you won't be able to install anything. No compiler, no 3rd party libraries and no text editor that isn't preinstalled
globular-toast 12 hours ago [-]
It's 80% of what you want and it's installed everywhere.
You could go for something closer to exactly what you want, but now you've got an extra set up step for devs and something else for people to learn if they want to change it.
I would say if you're looking for cli args then you shouldn't be using any wrapper like make at all. Just call the underlying tool directly. Make is for doing big high level things in the standard way, nowadays quite often in CI pipelines.
zelphirkalt 12 hours ago [-]
Yep, that's how I used it on the job before. "make test" would run tests locally and in CI pipeline, keeping the CI file refreshingly short at that point.
gjvc 10 hours ago [-]
you don't do it naked, you write and use wrapper scripts to make it ergonomic