How long does a Python upgrade take you? For me, 3.7 to 3.8 was: change my Docker version and push to CI... 5 minutes later it told me all tests passed and I deployed to production.
Adding new features will seldom break old stuff. It is the removing part that is hard.
(With the exception being when a variable broke because it became a keyword, but if you named a variable something like async, I'm not sure you're entirely innocent.)
The problem is that the entire ecosystem needs to pay attention to such warnings and it doesn't happen. As a result, these changes end up breaking code in places the program authors never touched.
I mean, if you're going to upgrade your Python version, you should check your logs to see if you've had any DeprecationWarnings recently. If you have, upgrade your deps before upgrading Python. If there's no new version available, you can't upgrade Python yet.
In my experience DeprecationWarnings get turned off so frequently because I'm not forking a big library to fix all of its deprecated uses of its dependencies.
The warnings are useless if they're not from my code, so they get turned off once globally.
Well, actually they are; they're essentially saying that in a future version your application will break.
Yes, it is not your package that is responsible, but it still affects your application; you could open a ticket, or submit a PR. If you had that message you should also hold off on upgrading to a newer Python until this is resolved.
> well actually they are, they essentially saying that in future version your application will break.
No, you idiot. It's a deprecated usage in a dependency.
What happens in the future is I update pandas, it stops using the deprecated numpy method, and the warning just disappears with very little action on my part.
It's a useless warning.
And submit a PR every time this happens? How about I request you pull me?
> If you had that message you should also hold off on upgrading to a newer Python until this is resolved.
Well, technically we already moved to 3.8, after which we needed a library that only works up until 3.7.
It would be nice if library developers kept their code up to date, but that doesn't always happen. The Python core devs know this; we all know this; yet they consciously break core libraries on the principle of caveat emptor.
I don't understand why they don't earmark these changes for 4.0. These kinds of things are a universal frustration in the community, and they are so easily avoidable.
One issue the ecosystem currently has (and it's not the only one; I believe this is difficult almost everywhere) is that tracking dependency rot is hard. Unless something breaks outright, you'll never know whether a library has been abandoned, and manually checking dozens of GitHub/GitLab repos is expensive and tedious.
PyPI has an API (https://pypi.org/pypi/<pkg-name>/json) that can be leveraged to implement alerts like "this pkg last released 5 years ago, it might be dead!". I guess that's what the "security" package uses already. It would be cool if they added an option to report on this sort of thing.
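Such a staleness check can be sketched against that JSON endpoint. The network fetch is left commented out (and assumes `requests` is installed); `sample` mimics the relevant slice of the API response, using the real `upload_time_iso_8601` field:

```python
import json
from datetime import datetime, timezone

def latest_release_age_days(pypi_json, now=None):
    """Return days since the most recent upload across all releases."""
    now = now or datetime.now(timezone.utc)
    upload_times = [
        # fromisoformat() can't parse the trailing "Z" before Python 3.11.
        datetime.fromisoformat(f["upload_time_iso_8601"].replace("Z", "+00:00"))
        for files in pypi_json["releases"].values()
        for f in files
    ]
    return (now - max(upload_times)).days

# Against the live index (assumes the third-party requests library):
# import requests
# data = requests.get("https://pypi.org/pypi/<pkg-name>/json").json()

sample = {"releases": {"1.0": [{"upload_time_iso_8601": "2015-03-01T00:00:00Z"}]}}
age = latest_release_age_days(sample)
print(f"last release {age} days ago" + (" -- might be dead!" if age > 5 * 365 else ""))
```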
> Deprecated since version 3.3, will be removed in version 3.8: The behaviour of this function depends on the platform: use perf_counter() or process_time() instead, depending on your requirements, to have a well defined behaviour.
I would be wary of any crypto library that continued to work with a warning for 8 years and no one bothered to fix it. Most likely no one was maintaining it.
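To be fair, the migration that particular warning asks for is mechanical; a sketch of the documented replacements:

```python
import time

# time.clock() was deprecated in 3.3 and removed in 3.8. The replacements
# split its platform-dependent behaviour in two:
start_wall = time.perf_counter()   # high-resolution elapsed (wall) time
start_cpu = time.process_time()    # CPU time of this process only

total = sum(range(1_000_000))      # some work to measure

elapsed = time.perf_counter() - start_wall
cpu = time.process_time() - start_cpu
print(f"elapsed {elapsed:.6f}s, cpu {cpu:.6f}s")
```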
I think that's probably a lesson learned from the Python 2 -> 3 migration: it's probably better to introduce many small breaks than one big one?
Python, despite having three-part version numbers, apparently does not follow semantic versioning. The major version number has generally had little significance (Python 3 was the exception: 1 -> 2 was not a big change, and they assure us 3 -> 4 won't be either).
"Better" depends on your agenda. If it's to shove the next major version down users' throats, maybe it's easier to swallow small bites (bytes?) than one giant cow. If it's to provide a predictable and pleasant user experience, I'm less confident this is the right approach.
So I did look it up, and they do provide a warning (in the first paragraph):
"The ast module helps Python applications to process trees of the Python abstract syntax grammar. The abstract syntax itself might change with each Python release; this module helps to find out programmatically what the current grammar looks like."
This module is a bit special because, as they say, it's supposed to reflect the current Python grammar. If they changed the grammar without making the module reflect it, that would lead to different kinds of issues.
It's one thing to extend the AST and add a couple children because you added new syntax or something. That's what I think most sane people would expect from reading that quote as a language evolves. So yeah, don't assume a node has 4 children when it might have 5+ tomorrow. But it's another thing to outright remove a child entirely and to be so vague about it too. (They didn't even have it be a sentinel value like None, or even care to leave a useful error message in the stack trace! The message makes you think the code was broken all along and simply never triggered before.)
Lest you think informing users more clearly is a foreign concept to them, look at the dis module [1] for comparison. They're extremely clear that the whole thing is an implementation detail of CPython. If anything, after reading that, one would think your conclusion would be "ah, I should program against the AST then, not the bytecode". So you do that, and then you're greeted with this nonsense! Obviously it's your fault for assuming there's anything stable to program against across minor releases.
The child wasn't removed; it was renamed from arguments to posonlyargs. This is related to the change allowing functions to specify that given arguments are positional-only.
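Assuming the change being discussed is the 3.8 positional-only-parameters work, the new field is easy to see by inspecting the node's `_fields` on CPython 3.8+:

```python
import ast

# On CPython 3.8+, function definitions can mark positional-only parameters
# ("def f(a, /, b)"), and ast.arguments carries them in posonlyargs.
tree = ast.parse("def f(a, /, b): pass")
args = tree.body[0].args

print(args._fields)
print([a.arg for a in args.posonlyargs], [a.arg for a in args.args])
```

Code that indexed into `arguments` children by position would break here even though ordinary `args` access still works.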
This module exposes internals of python, and providing such guarantees would cripple development, because it wouldn't even allow for refactoring the code.
Most languages don't run into this because they don't expose internals like that. You typically extract that yourself and you accept it can change every release.
As for dis, that's very different: the bytecode is just an optimization detail of CPython. Python code could work perfectly fine without it; the bytecode was introduced to make things faster, and other implementations won't use it. I would imagine Jython and IronPython most likely don't implement it, since they have their own bytecode formats (JVM and .NET).
ast, on the other hand, is expected to be identical in any implementation that claims compatibility with a given language version; you would expect PyPy, for example, to also provide this module.
That's what's great about having a good test setup: you would know pretty quickly if stuff broke. Which has been basically nothing for me (3.6 -> 3.7 -> 3.8).
To be clear, the pain here was hunting down the root cause and fixing it when it shouldn't have broken in the first place, not the speed of discovering the problem after the code had already broken.
You'd be amazed at how many stale Docker images are out there running old Python versions - Docker doesn't solve this issue, and in fact I've seen Docker images get updated less rather than more often, ironically!
Well. The container doesn't magically update itself. But FROM python:3.6 -> FROM python:3.8 doesn't seem that hard. Then again a flexible package manager (with recent repos) can do this as well.
For a while my company was considering Docker precisely to keep an old Python app (described as Python 1.5) running. The dependencies of that app haven't been maintained in a decade, will never be upgraded, and will likely disappear soon as the hosts remove their Mercurial repos. Having the base machine run something modern, with the old Python stuff in a Docker image, seemed like a promising solution, but they stuck with RHEL7 instead.
I hope you realize not everyone running Python is running web apps. There are huge companies that use Python for automation; everything runs from some compute grid storage (usually NFS) and is imported in the same environment as the client, instead of having a server-side environment separated from the client-side one.
It's not how long it takes me, it's how long it takes Ubuntu. Usually, a few years...
If I don't want to use OS packages, I have to decide to package, build, deploy, and maintain the Python packages, often in a way that separates it from the system Python package, while also managing any Python libraries we need (some of which may be C extensions and require compiling).
Whenever possible, I like to leverage upstream packaging so that I don't have to track security updates on my own.
I say this as someone who used to maintain the official Python.org RPM packages.