To parent's downvoters: would you kindly cut him some slack? It's OK to ask if you don't know. https://xkcd.com/1053/
Basically, even though Daniel might say "I didn't change the ABI", if your code worked before and now it doesn't, then as far as you're concerned that's an ABI break. This particularly shows up with changed defaults, and with removing stuff that's "unused" except that you relied on it, so now your code doesn't work. Daniel brings up NPN because that seems like an easy case for the public Internet, but there have been other examples where a default changed and, well... too bad, you were relying on something and now it's changed; you should have just known to set what you wanted explicitly, and then you'd have been fine.
Ohh that takes me back, that feature was used heavily in the FXP warez scene (the one the proper warez people looked down on), you’d find vulnerable FTP servers to gain access to, and the best ones would support this. That way you could quickly spread releases over multiple mirrors without being slowed down by your home internet.
That's progress I believe.
I ask this because I'd like to know what practices I might want to avoid to guarantee that there is no ABI breakage in my C project.
    typedef struct {
        char name[50];
        int age;
    } Person;

vs.

    typedef struct {
        int age;
        char name[50];
    } Person;
Basically, anything that moves bytes around in memory for data structures that are passed around will break the ABI. Of course, any API breakage is also an ABI breakage. If you have a struct which might grow, don't actually make it part of the ABI: don't give users any way to find its size, and write functions to create, destroy, and query it.
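The opaque-struct pattern described above might look roughly like this in C; the `Person` type matches the earlier example, but the function names are illustrative:

```c
#include <stdlib.h>
#include <string.h>

/* --- public header: callers only ever see an incomplete type --- */
typedef struct Person Person;   /* opaque; callers can't see its size or layout */

Person *person_create(const char *name, int age);
int     person_age(const Person *p);
void    person_destroy(Person *p);

/* --- private implementation: fields can be reordered or added
       in a later version without breaking compiled callers --- */
struct Person {
    char name[50];
    int  age;
};

Person *person_create(const char *name, int age)
{
    Person *p = calloc(1, sizeof *p);
    if (!p)
        return NULL;
    strncpy(p->name, name, sizeof p->name - 1);
    p->age = age;
    return p;
}

int person_age(const Person *p)
{
    return p->age;
}

void person_destroy(Person *p)
{
    free(p);
}
```

Because callers only ever hold a `Person *`, the library is free to change `struct Person` between versions; the allocation size is decided inside `person_create`, never by the caller.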
This got a bit messy because Windows also included compatibility hacks for clients that didn't set the length correctly.
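The Windows convention alluded to here is the size-prefixed struct (the `cbSize` idiom): the caller records which version of the struct it was compiled against, and the library only reads fields that caller actually has. A minimal sketch, with invented names:

```c
#include <stddef.h>

/* A public options struct that grew between versions. */
typedef struct {
    size_t cb_size;    /* caller sets this to sizeof of the struct it compiled against */
    int    timeout_ms; /* present since v1 */
    int    retries;    /* added in v2 */
} Options;

/* Library side: check cb_size before touching newer fields. */
int options_retries(const Options *opt)
{
    if (opt->cb_size >= offsetof(Options, retries) + sizeof opt->retries)
        return opt->retries;      /* caller knows about the v2 field */
    return 3;                     /* default for v1 callers */
}
```

The compatibility hacks mentioned above are what happens when clients pass a zero or garbage length and the library has to guess anyway.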
Thanks! This is very insightful. What is a solution to this? If I cannot expose structs that might grow what do I expose then?
Or is the solution something like: I can expose the structs that I need to expose, but if I ever need to extend them in the future, then I create a new struct for it?
Option 1: If allocating from the heap or somewhere otherwise fixed in place, then return a pointer-to-void (void *) and cast back to pointer-to-your-struct when the user gives it back to you.
Option 2: If allocating from a pool, just return the index.
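Option 2 might look like this sketch (the pool size and all names are invented); the caller only ever holds an integer, so the struct layout never crosses the library boundary:

```c
#define POOL_MAX 64

/* Private struct: never appears in the public header. */
struct session {
    int in_use;
    int value;
};

/* File-scope pool (it would be static inside a real library). */
struct session pool[POOL_MAX];

/* Public API deals only in integer handles. */
int session_open(int value)
{
    for (int i = 0; i < POOL_MAX; i++) {
        if (!pool[i].in_use) {
            pool[i].in_use = 1;
            pool[i].value  = value;
            return i;
        }
    }
    return -1;  /* pool exhausted */
}

int session_value(int handle)
{
    return pool[handle].value;
}

void session_close(int handle)
{
    pool[handle].in_use = 0;
}
```

An index is also easy to validate and can encode a generation counter to catch use-after-free, which a raw pointer can't.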
If you add new signatures or data structures, software compiled against the previous version should still work with the new version.
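That additive rule can be seen with a toy library (everything here is hypothetical): version 1.1 adds a new symbol without touching the existing one, so binaries linked against 1.0 keep working.

```c
#include <stdio.h>

/* v1.0 API: formats a greeting into buf, returns the length. */
int greet(char *buf, size_t n, const char *name)
{
    return snprintf(buf, n, "hello, %s", name);
}

/* v1.1 adds a new function; greet()'s symbol and behavior are
   untouched, so the change is ABI-compatible. */
int greet_loudly(char *buf, size_t n, const char *name)
{
    return snprintf(buf, n, "HELLO, %s!", name);
}
```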
In my opinion the whole issue is more important on Windows than on Linux. On Linux you can just recompile the application against the new library, or keep both the old and the new soversion around.
Some Linux distributions go into major contortions to make ABI stability work, and still, compiled applications that are supposed to work with newer distros crash. It is a waste of resources.
Debian chose to do both: https://wiki.debian.org/ReleaseGoals/64bit-time . Wherever they could, they recompiled much of the stuff, changing package names from libsomething to libsomethingt64; where they couldn't recompile, the app still "works" (does not segfault), but links with the 32-bit library and just gets wrong values. Other distros had a flag day: they essentially recompiled everything and didn't bother with non-packaged stuff that was compiled against the old 32-bit libs, thus breaking the ABI.
I wish the Python core developers had even the level of commitment to stability that developers of JavaScript frameworks do. Instead they intentionally break API compatibility every single release, I suppose because they assume that only worthless ideas are ever expressed in the form of Python programs.
But then, I normally try to stay on the leading edge. I think it’s more difficult if you leave it 2+ years between updates and ignore deprecation warnings. But with a year between minor releases, that leaves almost a two year window for moving off deprecated things.
I think that’s reasonable. I don’t experience the pain you describe, and I don’t get the impression that the Python project treats Python programs as “worthless”. The people working on Python are Python users too, why would they make their own lives difficult?
Nobody has to worry about ignoring deprecation warnings in libcurl, or for that matter in C, in English, in Unicode, or in linear algebra. There's no point at which your linear algebra theorems stop working because the AMS has deprecated some postulates. Euclid's theorems still work just as well today as they did 2000 years ago. Better, in fact, because we now know of new things they apply to that Euclid couldn't have imagined. You can still read Mark Twain, Shakespeare, or even Cicero without having to "maintain" them first, though admittedly you have to be careful about interpreting them with the right version of language.
That's what it means for intellectual work to have lasting value: each generation can build on the work of previous generations rather than having to redo it.
Last night I watched a Primitive Technology video in which he explains why he wants to roof his new construction with fired clay tiles rather than palm-leaf thatch: in the rainy season, the leaves rot, and then the rain destroys his walls, so the construction only lasts a couple of years without maintenance.
Today I opened up a program I had written in Python not 2000 years ago, not 200 years ago, not even 20 years ago, but only 11 years ago, and not touched since then. I had to fix a bunch of errors the Python maintainers intentionally introduced into my program in the 2-to-3 transition. Moreover, the "fixed" version is less correct than the version I used 11 years ago, because previously it correctly handled filename command-line arguments even if they weren't UTF-8. Now it won't, and there's evidently no way to fix it.
I wish I had written it in Golang or JS. Although it wasn't the case when I started writing Python last millennium, a Python program today is a palm-leaf-thatched rainforest mud hut—intentionally so. Instead, like Euclid, I want to build my programs of something more lasting than mere masonry.
I'm not claiming that you should do the same thing. A palm-leaf-thatched roof is easier to build and useful for many purposes. But it is no substitute for something more lasting.
Today's Python is fine for keeping a service running as long as you have a staff of Python programmers. As a medium of expression of ideas, however, it's like writing in the sand at low tide.
Isn't fixing this the whole point of Python's "surrogateescape" handling? Certainly, if I put the filename straight from sys.argv into open(), Python will pass it through just fine:
    $ printf 'Hello, world!' > $'\xFF.txt'
    $ python3 -c 'import sys; print(open(sys.argv[1]).read())' $'\xFF.txt'
    Hello, world!
Though I suppose it could still be problematic for logging filenames or otherwise displaying them.

I mean, that last part really unravels your point. Linguistic meanings definitely drift significantly over time in ways that are vitally important, and there are no deprecation warnings about them.
Take the second amendment to the USA constitution, for example. It seems very obviously scoped to “well-regulated militias”, but there are no end to the number of gun ownership proponents who will insist that this isn’t what was meant when it was written, and that the commas don’t introduce a dependent clause like they do today.
Take the Ten Commandments in the Bible. It seems very obvious that they prohibit killing people, but there are no end to the number of death penalty proponents who are Christian who will insist that what it really prohibits is murder, of which state killings are out of scope, and that “thou shalt not kill” isn’t really what was meant when it was written.
These are very clearly meaningful semantic changes. Compatibility was definitely broken.
If “you have to be careful about interpreting them with the right version of the language”, then how is that any different to saying “well just use the right version of the Python interpreter”?
> Today I opened up a program I had written in Python not 2000 years ago, not 200 years ago, not even 20 years ago, but only 11 years ago, and not touched since then. I had to fix a bunch of errors the Python maintainers intentionally introduced into my program in the 2-to-3 transition.
In your own words: You have to be careful about interpreting it with the right version of the language. Just use a Python 2 interpreter if that is your attitude.
I don’t believe software is something that you can write once and assume it will work in perpetuity with zero maintenance. Go doesn’t work that way, JavaScript doesn’t work that way, and Curl – the subject of this article – doesn’t work that way. They might’ve released v7.16.0 eighteen years ago, but they still needed to release new versions over and over and over again since then.
There is no software in the world that does not require maintenance – even TeX received an update a few years ago. Wanting to avoid maintenance altogether is not achievable, and in fact is harmful. This is like sysadmins who are proud of long uptimes. It just proves they haven’t installed any security patches. Regularly maintaining software is a requirement for it to be healthy. Write-once-maintain-never is unhealthy and should not be a goal.
That sounds like a super useful feature that would be great if more FTP servers supported it. I guess FTP itself is a dying protocol these days, but it's extremely simple and does what it says on the tin.
I think it will survive as a protocol, as a fallback mechanism. Ironically, I have used FTP on a smartphone here and there because smartphone OSes are abysmally useless. Don't get me started on your awesome proprietary sync app; I don't do trashy, and they all are.
Otherwise I do everything today through scp and HTTP, but that is technically less optimal; it just happens to be widely available. FTP would, in theory, provide a cleaner way to handle transfers and permission management.
Well, Android anyway. I don't know how things work in the Apple world. It's bizarre that whatever the "official" method of file transfer is, it's so bad. Also, managing files on Android is, on its own, very bad. FTP allows connecting a decent file manager to the phone and doing the management externally.
If Microsoft would also ship the library in %system32%, we would have a truly cross-platform and stable, OS-provided and -patched high-level network protocol client.
(So that probably won't happen)
Edit, I had a recollection I saw something like that before, this might be that: https://www.codeproject.com/articles/1045674/load-exe-as-dll...
It is possible to do that in the general sense though.
I'm not sure if this is accurate. Why do they include a default alias in Powershell for `curl` that points to the `Invoke-WebRequest` cmdlet then?
I've always installed curl myself and removed the alias on Windows. Maybe I've never noticed the default one because of that.
Guessing this is for backwards compatibility with scripts written for the days when it was just PowerShell lying to you.
I never needed curl on Windows, because on OSes that provide a full stack developer experience such things are part of the platform SDK, and rich language runtimes.
It is only an issue with C and C++, and their reliance on POSIX to complement the slim standard library, effectively making UNIX their "runtime" for all practical purposes.
And now for a personal opinion: I'll take libcurl over .NET's IHttpClientFactory any day.
Additionally, writing little wrappers around OS APIs is something that every C programmer has known since K&R C went outside UNIX V6, which again is why POSIX became a thing.
Just `new HttpClient` and cache/dispose it. Or have DI inject it for you. It will do the right thing without further input.
The reason this factory exists is to pool the underlying HttpMessageHandlers, which hold an internal connection pool for efficient connection/stream reuse (in the case of HTTP/2 or QUIC) in large applications that use DI.