Summary
Brett Cannon has been a long-time contributor to the Python language and community in many ways. In this episode he shares some of his work and thoughts on modernizing the ecosystem around the language. This includes standards for packaging, discovering the true core of the language, and how to make it possible to target mobile and web platforms.
Announcements
- Hello and welcome to Podcast.__init__, the podcast about Python and the people who make it great.
- When you’re ready to launch your next app or want to try a project you hear about on the show, you’ll need somewhere to deploy it, so take a look at our friends over at Linode. With the launch of their managed Kubernetes platform it’s easy to get started with the next generation of deployment and scaling, powered by the battle tested Linode platform, including simple pricing, node balancers, 40Gbit networking, dedicated CPU and GPU instances, and worldwide data centers. Go to pythonpodcast.com/linode and get a $100 credit to try out a Kubernetes cluster of your own. And don’t forget to thank them for their continued support of this show!
- Are you bored with writing scripts to move data into SaaS tools like Salesforce, Marketo, or Facebook Ads? Hightouch is the easiest way to sync data into the platforms that your business teams rely on. The data you’re looking for is already in your data warehouse and BI tools. Connect your warehouse to Hightouch, paste a SQL query, and use their visual mapper to specify how data should appear in your SaaS systems. No more scripts, just SQL. Supercharge your business teams with customer data using Hightouch for Reverse ETL today. Get started for free at pythonpodcast.com/hightouch.
- Your host as usual is Tobias Macey and today I’m interviewing Brett Cannon about improvements in the packaging ecosystem, the promise of WebAssembly, and his recent explorations of CPython’s interpreter.
Interview
- Introductions
- How did you get introduced to Python?
- As a core contributor to CPython, a member of the steering Council, and the team lead for VSCode’s Python extension, what are your current areas of focus for the language?
- One of the PEPs that you were involved with recently introduced the pyproject.toml file for simplifying the work of building Python packages. Can you share some of the background behind that work and the goals that you had for it?
- Since its introduction a lot of people have co-opted that file for other project configuration. What was your reaction to that, and if you had foreseen that usage what might you have changed or added in the PEP to account for it?
- What are the long term impacts on the packaging ecosystem that you anticipate with the standardization efforts that are happening?
- Another area where there is a lot of attention right now is being able to target additional deployment environments such as the browser, with WebAssembly, and mobile devices, with projects like Briefcase and Kivy. You had a recent post where you posed some questions about the true nature of Python and the possibility of removing pieces of it to simplify building for these other runtimes. What is your personal sense of the minimal set of features that we need for something to still be Python?
- How have projects such as MicroPython and Pyodide influenced your thinking on the matter?
- You have also recently been writing a series of articles about the implementation details of different syntactic elements of Python. What was your inspiration for that?
- What are some of the interesting or surprising details that you encountered while unwrapping the way that the interpreter handles those syntactic elements?
- How have those explorations helped you in your efforts to identify the core of Python?
- Recent releases of Python have brought in some substantial changes to the interpreter and new language features (e.g. PEG parser, pattern matching). What are some of the other large initiatives that you are keeping track of?
- What are your personal goals for the near to medium term future of Python?
- What are the most interesting, unexpected, or challenging lessons that you have learned while working on the Python language and related tooling?
- If you were to redesign Python today, what are some of the things that you would do differently?
Keep In Touch
- brettcannon on GitHub
- @brettsky on Twitter
- Blog
Picks
- Tobias
- Cold Brew Iced Tea
- Loki on Disney+
- Brett
- Rich
- Textual
- The physics facts included in all of the Python 3.10 release announcements, e.g. you will never see a green star
Links
- Brett’s Blog
- Python VSCode Extension
- Python Steering Council
- Python Packaging Authority
- UC Berkeley
- Vancouver, BC
- Squamish, Musqueam, Tsleil-Waututh First Nations
- Pascal
- Python
- C
- O’Reilly
- PyCon US 2021 Steering Council Keynote
- Python Developer-In-Residence
- PSF Visionary Sponsorship
- Setuptools
- Pip
- Python Wheels
- PyPI
- PEP 518
- PEP 517
- PEP 621
- pyproject.toml
- Flit
- Enscons
- PyPA Build
- PyOxidizer
- Pex
- Shiv
- cx_Freeze
- cibuildwheel
- Thomas Kluyver
- Poetry
- Vaults of Parnassus
- MicroPython
- CircuitPython
- Desugaring Python Blog Series
- JupyterHub
- Pyodide
- JupyterLite
- ANSI C99
- PyPy
- Jython
- IPython
- ncurses
- Kivy
- Briefcase
- Toga
- PEP 401
The intro and outro music is from Requiem for a Fish The Freak Fandango Orchestra / CC BY-SA
Hello, and welcome to Podcast.__init__, the podcast about Python and the people who make it great. When you're ready to launch your next app or want to try a project you hear about on the show, you'll need somewhere to deploy it. So take a look at our friends over at Linode. With the launch of their managed Kubernetes platform, it's easy to get started with the next generation of deployment and scaling powered by the battle tested Linode platform, including simple pricing, node balancers, 40 gigabit networking, dedicated CPU and GPU instances, and worldwide data centers.
Go to pythonpodcast.com/linode, that's l i n o d e, today and get a $100 credit to try out a Kubernetes cluster of your own. And don't forget to thank them for their continued support of this show. Are you bored with writing scripts to move data into SaaS tools like Salesforce, Marketo, or Facebook Ads? Hightouch is the easiest way to sync data into the platforms that your business teams rely on. The data you're looking for is already in your data warehouse and BI tools. Connect your warehouse to Hightouch, paste a SQL query, and use their visual mapper to specify how data should appear in your SaaS systems. No more scripts, just SQL. Supercharge your business teams with customer data using Hightouch for reverse ETL today. Get started for free at pythonpodcast.com/hightouch.
Your host as usual is Tobias Macey. And today, I'd like to welcome back Brett Cannon to talk about some of the improvements in the packaging ecosystem, the promise of WebAssembly, and some of his recent explorations of CPython's interpreter. So, Brett, can you start by introducing yourself for anyone who isn't familiar with you? So as Tobias said, my name is Brett Cannon.
[00:01:48] Unknown:
I am the dev manager for the Python extension for VS Code, and that is how I pay for my open source habits, being a Python core developer for the past 18 years. And I'm also a member of the Python steering council. I work a lot on the Python Packaging Authority, the PyPA. It's not really that authoritative. It was a bad name choice. And, yeah, I am calling from Vancouver, Canada, which I should mention is on the unceded traditional territories of the Musqueam, Squamish, and Tsleil-Waututh First Nations.
[00:02:19] Unknown:
And that's pretty much it for me. And do you remember how you first got into Python?
[00:02:24] Unknown:
Yes. I do. So when I was doing my undergrad at UC Berkeley, there was an entrance exam to get into the intro computer science course. And at that time, I didn't know object oriented programming. I was afraid it was gonna be on that exam. And so I cast around looking for a programming language that taught object oriented programming, because at that point, all I knew was Scheme and C. And a little bit of Pascal, but it barely counted. So I started to read around, looking for a language to learn, and everything I kept coming across was saying, like, at the time, like, oh, well, there's Perl, but that should be, like, your 6th language.
But there's this other language called Python that's kinda small and not really that widely used, but it's great for teaching. And so, oh, okay. Cool. So I got the O'Reilly book at the time, and I learned it. And I was like, oh, this is great. This fits my brain, as we like to say in the community. And I've just been using it ever since. That was the fall of 2000.
[00:03:27] Unknown:
And as you mentioned, you've been a core contributor to Python for a number of years, and to be specific, the CPython interpreter, because these days there are a number of different Pythons to choose from. Yeah. You're also a member of the steering council, which has taken over the role that Guido used to fill as the BDFL. And, also, as I believe you mentioned, you are the manager for the team that develops the VS Code extension for Python. And given the amount of overlap and all the different touch points that you have with the Python community, I imagine that you spend a lot of time thinking about Python in various aspects. And I'm curious with that Venn diagram of responsibilities and roles, what your current areas of focus are for the language, both as a core contributor and as an end user.
[00:04:14] Unknown:
It's interesting. Being a member of the steering council, there's 2 aspects to it. You can kinda get an idea of this if you watch our keynote from PyCon US 2021, which is up on YouTube now. There's the kind of where's the language going, in kind of having an idea of where we think things should go, but there's also this aspect of the steering council isn't there to dictate what happens. It's very much a, not a group of last resort per se, but we are there to help drive consensus, and as I like to say, we're there in case the entire volunteer team quit tomorrow. We're basically the backstop as it were.
So it's a little funky because, for instance, right now, we're trying to hire a Developer-in-Residence, because one of the visionary sponsors that the PSF now has, Google, part of their visionary sponsorship money is to go towards hiring someone full time to work on CPython and the Python language. And so we're trying to do that. But for instance, we're not personally on the steering council as a council trying to drive any specific change in the language. Right? Like, we're not writing a PEP as a group to try to make change happen. We instead wait for people to bring PEPs to us to then either delegate to the appropriate person or us as a group of 5 to try to make the best call we can if there's not obvious consensus from the development team as to what to do about a PEP. And usually, if the PEP comes to us, chances are there is no consensus because otherwise, basically, it's rubber stamped. There's no conversation.
So from that perspective, I'm personally not pushing any PEPs at the moment. And as a steering council, we're not really driving any changes to the language. We're just trying, basically, to improve the development process around Python so that we can keep the project healthy and running and something that we can all use and enjoy. The next thing over that I'm doing is probably all my work in the packaging community. And the general theme of that, if you look at what I've been doing over the last couple years in that realm, is I'm trying as best I can to help build standards and shared packages in the packaging community to make it easier for other tools to flourish as it were.
So, basically, from my perspective, if you think about, like, setuptools and pip, you want to use those tools because you want to, not because you feel you have to. Basically, packaging in Python is so old and has so much entrenched history, we don't really get an opportunity to have a single tool that we all get behind. I mean, pip is probably the closest, but even then, it has a lot of backwards compatibility concerns and various other things that make it the way it is, same with setuptools. And a lot of it's driven by convention. Right? There's not a way for other tools to necessarily step in because we haven't necessarily written down everything that needs to be written down, that we all standardize on, to make sure that another tool could potentially be used by someone to generate the exact same outcome, whether it's build a wheel or install a wheel or what have you.
The other thing is there's a lot of code that's pretty much universal around these various PEPs that we are writing and/or have written that don't have a package behind them that people can just rely on to basically get the right outcome. Now that isn't to say that we shouldn't have standards to make sure that people could reimplement the exact same thing in their own way, but it is definitely helpful to at least have a baseline they can build off of such that that boring bit that everyone has to do isn't repeated. So that's basically what I've been doing in the packaging world is trying to drive PEPs that kind of help standardize things, typically around binary artifacts or configuration.
And then along those lines, also help try to make sure that we end up having packages on PyPI that tool builders actually can rely on and use so that they can work on more of the innovative stuff that they wanna tackle versus the boring stuff that everyone has to tackle. In the packaging ecosystem, that's definitely somewhere that I've been seeing a lot of activity lately on a number of different fronts. And,
[00:08:34] Unknown:
you know, as I was preparing for this conversation, the kind of tagline that I put in just to give myself some focus of where to bring this conversation was modern Python. And I think that one of the pieces that's happening to modernize the language and modernize the ecosystem is this broader compatibility with packaging and how to make packaging and distribution easier, because that, as you said, is a very long and storied past for the language that hasn't always been easy or straightforward. And I know that one of the PEPs that has your name on it that is driving in this direction is related to pyproject.toml, which has been sort of co-opted in a number of different ways, but there are also a number of other PEPs and efforts that are underway. You know, I'm thinking of the PEP, whichever number it is, that is trying to provide the __pypackages__ directory to be more akin to how npm handles dependencies, where you just have a path that has all your dependencies to make it easier to relocate them and make reusable environments without necessarily having to rely on virtual environments, which is another thing that has a long and storied history in Python. So it's PEP 582.
Yes. So I'm wondering if you can just kind of give some overview of the different efforts that are happening in the space and how pyproject.toml specifically is incorporated into some of those efforts and maybe helps to kick start some of these other initiatives.
[00:10:01] Unknown:
I think the first PEP I ever wrote for packaging is PEP 518, which introduces the pyproject.toml file. And I should mention I'm just a coauthor; Nathaniel Smith, Donald Stufft, and I, and I think I might be forgetting somebody else, coauthored that PEP. So it's not all me, whether you love it or hate it. But that PEP was basically driven by the fact that setup.py as a configuration format from setuptools, without being too blunt, is kind of a bad idea. Right? You don't want executable code as the way to configure your packaging data. Right? It's not introspectable.
It's a potential security hole. It's messy, and it leads to people oftentimes doing some things that are not necessarily good. So we had a drive to want to standardize that kind of thing. At the same time, there is also this desire to, once again, as I've said earlier, make it so that you don't feel like you have to use setuptools, that you can just choose to use setuptools. As such, we needed a way for a project to tell a tool such as pip or wheel or whatever tool you wanna use to build your binary artifact, whether it's an sdist or a wheel or what have you, basically, what do you need installed to actually do the build? Right? Like, how do you tell a build tool, I need setuptools installed? Because up to this point, we've all just assumed it was setuptools. Right? This is why when you create a venv, setuptools and pip are just automatically installed, because it's just a massive assumption that setuptools is what you need. But if you want to kinda let a million build tools flourish, you gotta have a way for people to be able to say, hey, as a project, I actually don't want setuptools. What I want is Flit, for instance.
So that's when we created pyproject.toml and introduced this build-system section to that file that basically says, okay, here are the build tools I need to have installed. That was it. It was not much else than that, very straightforward. We had more plans eventually along those lines, and that kinda led to PEP 517, which lets you define an API to drive those build tools. Right? Like, this is why pip can do your builds. It's actually not pip doing the builds. pip just knows how to read a pyproject.toml file to install the build tools you need and then how to call those build tools. But it's basically kind of crept its way down.
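To make the build-system table Brett describes concrete, here is a hedged sketch of reading it the way a frontend might; the Flit backend named here is just an illustrative choice of PEP 517 backend, and `tomllib` assumes Python 3.11 or newer (the third-party tomli package fills the same role on older versions).

```python
# A minimal sketch of the [build-system] table a PEP 517 frontend like pip
# looks for.  Assumes Python 3.11+ for tomllib; the Flit backend named here
# is only one example of a build backend.
import tomllib

pyproject = tomllib.loads("""
[build-system]
requires = ["flit_core >=3.2,<4"]     # what must be installed to run the build
build-backend = "flit_core.buildapi"  # which PEP 517 backend to call
""")

build_system = pyproject["build-system"]
print(build_system["requires"])       # ['flit_core >=3.2,<4']
print(build_system["build-backend"])  # 'flit_core.buildapi'

# A frontend then installs `requires`, imports the backend, and calls the
# standardized hooks, e.g. backend.build_wheel(wheel_directory).
```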
Actually, funny enough, PEP 518 was originally specified to only let you specify the build tools required, but other tools started to co-opt that. We specifically actually said in the PEP, this is not meant for use by anyone else except build tools, and then other projects like Black started to use it for their configuration. So we actually went back, while it was still provisional, and took that paragraph out and just said, okay, as long as you specify everything in a tool section, it's fine. We realized everyone wants to kinda consolidate all their configuration files down to one single file, because I don't know about you, but having, like, 5 different files just to, like, manage my linter and my formatter and my build tool and all this stuff drives me a bit bonkers, so I kind of appreciate that the community is kind of rallying around pyproject.toml that way.
So that kind of led to a way to at least build and specify your build tools. Now, the next step after that is configuration. Right? It's like, okay, I can specify what build tool I want. Like, I can say I want Flit, and I want this version of Flit, and here's how to call Flit to get what you want. But now we have the metadata of my project, right, that's consistent across all the projects. If you ever look at a wheel file, which, by the way, is just a zip file with a fancy file extension, it has a file in there called METADATA. And you can actually, if you go into your site-packages, you'll find a bunch of directories named after the projects you have installed, with the project name plus .dist-info.
If you look in that directory, you'll find a METADATA file, all caps. And in there is a bunch of metadata about the project. And, like, this is how, when you do, like, pip list, it can list stuff, and it can tell you what version is installed and all that, because that's all recorded in that file. The deal is that file is standard, but there was no standardized way for people to write that information down. Right? Like, what's the name of your project? What's its description? What are the keywords you want to show up in PyPI? Like, what's its homepage? That kind of thing. Right? None of that was standardized. All this stuff has usually, historically, been in setuptools, in your setup.py in the call to the setup function from setuptools, or in your setup.cfg. But once again, no standard anywhere. Everyone did it their own way.
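That standardized METADATA file is also readable from the standard library; a small sketch using importlib.metadata (Python 3.8+), with pip picked only because it is almost always installed:

```python
# Sketch: reading the METADATA that lives in a <project>.dist-info directory,
# the same information `pip list` reports.  Python 3.8+; "pip" is used here
# only because it is present in most environments.
from importlib.metadata import metadata, version

meta = metadata("pip")      # the parsed METADATA file of the installed project
print(meta["Name"])         # 'pip'
print(meta["Summary"])      # its one-line description
print(version("pip"))       # whatever version happens to be installed
```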
And so I wrote PEP 621, where basically all of that metadata that is completely consistent across all projects, that everyone needs to provide to generate that wheel and get something up on PyPI, can now be specified in pyproject.toml. And one of the key reasons I wanted to do that was, one, to meet that goal and dream of no more executable code to gather your static information about your project, so that it's easy to introspect. Right? Like, if I have to run your setup.py just to find out what version you're using or what the name of your project is or what your dependencies are, I can't just necessarily crawl all the projects on GitHub to get a bunch of information about anything. Right? I have to run your code, and that's just not safe unless I'm doing it in some throwaway container or something that's all locked down and I don't have to worry about. Way too much work for me. I just want to be able to read a TOML file to get some information.
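As an illustration of that introspection point, here is a sketch of pulling PEP 621 metadata straight out of pyproject.toml without running any project code; the field values are invented, and tomllib again assumes Python 3.11+.

```python
# Sketch: statically reading PEP 621 metadata -- no setup.py needs to run.
# The project values are invented examples.
import tomllib

pyproject = tomllib.loads("""
[project]
name = "example-project"
version = "1.2.3"
description = "An invented project used only for illustration"
dependencies = ["requests>=2.25"]
""")

project = pyproject["project"]
print(project["name"], project["version"])
print(project["dependencies"])   # plain data, safe to crawl at scale
```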
The other thing, though, is when you standardize the information, it means the amount of information that's unique per tool becomes that much smaller, which makes it that much easier to switch between tools. Say you wanna use Flit, but then you wanna go try some other tool like enscons or whatever build tool you want. If the huge chunk of common metadata is specified in pyproject.toml in the standard way, it's only the little bit of difference between each tool that you have to change. So by doing that, I wanted to kind of help open up the ecosystem to make it so that experimentation between these build tools isn't quite so cumbersome. Right? Because as of right now, if I was using setuptools, for instance, with a setup.py, and then I wanted to try Flit, it's a lot more work. It's a lot harder to convince people, hey.
Please try Flit out. It's nice and fast or whatever. Or for whatever reason, you wanna try something else out, not to pick on setuptools specifically, by the way. But if you can lower that barrier of entry for someone, it allows people to try things out more and to find that workflow that works for them and then also to basically just give other tools a chance. Right? Because if there's one thing I've learned being involved in the packaging world, packaging code is the very last thing anyone ever touches and changes, right? Like, you write that code for one project once, and you will copy and cargo-cult that code forward for decades. Right? Like, convincing someone to try something out that's new is really hard.
The tough thing for us when we try to introduce new standards is: how do we convince people to put in the time and effort to change their packaging code to use the newer standard? Because if you've got something that's been working for you for 5 years or something, you think, I don't wanna change this. My users don't see this. I still get a wheel in the end. Why do I care? So by lowering this, the goal was to try to make it so that experimentation is that much easier. Those people who are willing to just do that little extra bit of work, versus all that extra work, will be able to try some new stuff out.
And that's kinda where pyproject.toml sits at the moment: as a standard way to specify what build tools you need installed to build your wheels and your source distributions, how to call those build tools, and now how to specify the common metadata between projects, as long as your build tool supports it. It's very, very new, so some tools are working towards using it, but not everyone. So for instance, setuptools hasn't adopted it yet. I don't think anyone's tried to do a PR quite yet because they're very busy people.
[00:18:25] Unknown:
And another downstream impact as well of having this more standardized way of being able to specify the build and metadata information and make it easier for the build tools to be at least somewhat interoperable is that as you're working across the ecosystem, you don't necessarily have to move outside of your comfort zone to say, you know, I've been using Pip for this project, but now if I wanna contribute over here, I have to use Poetry because they've got their own way of doing things. And I'm going over here now, and now I have to use an entirely different build tool because they've got a different way of doing it. And so as a contributor and as a user, it also reduces the switching cost and the on ramp to new projects if you're trying to either use or contribute code.
[00:19:10] Unknown:
Yeah. So a good point about that actually is the PyPA has a project that's literally called build, and it's a CLI tool that will build a project. It can do that because we have standards now on how to drive build tools. Right? You don't have to use wheel directly. You can just use the build tool, which literally does nothing but, like, build a project directory, and it'll do the right thing. It'll create the temp directory, create a virtual environment, install the build requirements into it, take your code, run the build tool, produce wheels and sdists, and then say, here you go. And because of that, it's allowing, as you said, people to kinda have either very simplified workflows, but also use whatever tool they want. Right? Whether you wanna use pip to drive the build or the little build tool, build, to drive it, or whatever you want. Right? Once it's standardized, you can have the workflow that you wanna have, and it won't matter what I wanna use compared to what you wanna use, Tobias, because we're using standards. So whatever. I might wanna use build. You might wanna use pip. It's still gonna produce the same thing in the end, so have at it. It's great.
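A rough sketch of driving that frontend from a script; it assumes the build package has already been installed (`pip install build`), and the flags shown are its documented sdist/wheel options with artifacts going to the default dist/ directory.

```python
# Sketch: invoking the PyPA `build` frontend on the current project.
# Assumes the `build` package is installed; artifacts land in ./dist.
import subprocess
import sys

subprocess.run(
    [sys.executable, "-m", "build", "--sdist", "--wheel", "."],
    check=True,
)
# Because the project's backend is declared in pyproject.toml, the same
# command works whether the project uses setuptools, Flit, or anything else.
```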
And another benefit I actually forgot to mention, that I'm in the process of designing and hoping to see implemented, is when everything's a standard, you don't have to reimplement everything. Right? Like, so for instance, we don't have common code in the packaging world for reading the metadata file. And I'm right now in the middle of having a chat with various folk in the packaging community about trying to design an API to do that. Because why should every install tool need to know how to read that file, or every build tool know how to write that file, when it's a complete standard that everyone has to do anyway? So we're still kind of at that somewhat, I don't wanna say nascent, it's not that early, but we're still at that step in the whole progression of trying to make packaging easier in Python of trying to just lower the amount of code that everyone has to write bespoke and trying to raise the amount of shared code so that the cost of maintenance is amortized across everybody, and it's a lot easier for everyone to work on. Because, once again, another benefit of standardization: it all comes out the same way anyway. So you might as well just share as much code as possible so that you don't have to do the boring stuff. Absolutely. And I'm wondering what your thoughts are on the potential
[00:21:27] Unknown:
downstream impact of doing things like distribution of code. So I'm thinking in terms of things like cx_Freeze or PyOxidizer, and then also things like shiv or pex for being able to produce self contained packages, for being able to just put it on a box and run it, and how these standards in terms of the build systems and the metadata might be able to either work in the favor of those tools or, you know, additions to or new standards that will help in terms of being able to simplify distribution of these self contained binaries to make it easier to share your packaged and compiled code with other people?
[00:22:09] Unknown:
The trick, I think, with self contained distributions of a Python project is typically, right now, as of today, more about how do you get a copy of Python that you can just move about on your disk. Right? Like, how do you have a relocatable installed Python? Because, historically, Unix builds use relative paths or absolute paths to stuff only on your OS or in your system, so that if I send you a zip file of something, it won't necessarily work. So I think that's really the bigger hurdle right now around that than it is around the packaging story. Now getting wheels on the right platforms and all that does typically help make this a bit easier, because then at least you use all the tooling to pull everything together and have that wheel, like, using cibuildwheel, which is actually another project that just joined the PyPA this month, actually.
So we're trying to make it easier for everyone to build those wheels on all those platforms, and then these larger tools like Briefcase and such that wanna package up Python into a single thing can at least grab those wheels and have those all pulled together and not have to do a full rebuild. But I think, honestly, getting Python itself working, and typically CPython is usually what people are talking about, so that it can just be plopped down on someone's disk no matter what OS, no matter what directory, I think right now that's the bigger challenge that needs to be tackled. The dream of, well, Go and Rust get to have single binaries, why don't we?
[00:23:33] Unknown:
Right? So I think that's the challenge that has to be tackled for that. In terms of the broader sort of packaging and distribution ecosystem, what are some of the other near to medium or potentially long term impacts that you see of the standardization efforts that are happening now and some of the, you know, potential successive efforts that might follow in those footsteps to simplify the work of being a user, being a contributor to Python and its ecosystem, just more broadly being able to actually make use of all the code that we're writing? One direct benefit I've seen already is
[00:24:06] Unknown:
I've mentioned Flit a couple times. For those of you who've never used Flit, Flit is a packaging tool designed to work only with pure Python packages. Right? No extension module support. But because it doesn't worry about that, it's very simple. And Thomas Kluyver, the creator of the project, also gets to be very opinionated in it. So for instance, any of you setuptools users out there, how much do you like your MANIFEST.in file? How many times have you accidentally left a file out of that thing? Right? Flit doesn't have that concern when it comes to sdists because the opinion that Flit takes is if it's checked in to Git, it should be in the sdist.
That's it, real straightforward. The source distribution is basically just a tarball of stuff that's checked into your Git repo. Done. What about creating the wheel file? Well, because it's pure Python, that's nothing more than just copying some files into a zip that has a .whl extension with the fancy wheel tag name. So all you have to do is tell it what directory your stuff is in. Versioning? Well, it gets the version of your package from the __version__ attribute. So if you just specify that in your code, it will just read it from there. You don't even have to worry about specifying it somewhere else.
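To picture that, here is an invented single-module project of the sort Flit can package; the module name, version, and function are made up, and the only packaging-related pieces in the code are the docstring and __version__ that the tool reads.

```python
# example_project.py -- an invented module used only for illustration.
# A pure-Python tool like Flit can read the version (and, by convention,
# a short description from the docstring) straight from this file.
"""An invented example package used only for illustration."""

__version__ = "1.2.3"   # read by the build tool rather than duplicated


def greet(name: str) -> str:
    """Return a greeting, just to give the module some behavior."""
    return f"Hello, {name}!"
```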
So Flit takes a very clean, simple, straightforward approach of: if you're using Git and you keep your version in a __version__ in your package, you're done. After that, it's just writing out your metadata, which you have to do no matter what tool you use, and that's it. And if you specify the name the same as the directory, you don't have to specify what directory represents your package. It's just automatic based on the name. So it's really nice and straightforward. And the only reason Flit can exist, partially, is because it implements PEP 518 and PEP 517.
Right? Because it implements those standards, you could use it, I could use it, it doesn't matter, and pip can totally manage it. Right? If it downloads an sdist, it knows how to use it to build a wheel, because it sees there's a pyproject.toml that says, hey, install Flit, and, oh, here's how to run Flit to get your wheel, and then pip can just install it. There's no worry about, well, is this setup.py or something else? Are they having to shim in some way to look like setuptools with a setup.py or anything like that? Right? So there's already a flourishing of new tools. Right? So the fact to me that Flit exists and that people use it is already a testament that this work has been worth it and will continue to be worth it to try to drive this. But for me, that's kind of the goal here: more tools that are able to do things that people need in an easier fashion. Right? Like, I'm a productivity nerd when it comes to programming. Right? Like, this is why I work on the Python extension for VS Code, because I wanna see developers be productive. It's why I work on Python, right, whose entire goal is to make developers more productive.
So for me, from a packaging perspective, it's how do I help everyone else be productive when it comes to packaging, whether it's installing or building or what have you. I mean, we're very lucky that Python has the flexibility that it has to have all these extension modules and to be able to be the glue code of the world as it were. But because of that, we've historically had a really complicated relationship with packaging, because we're the language that has to do its own thing for its own language, and C, and Fortran, or whatever other crazy language someone wants to compile alongside Python to make it so they have a nicer language to work with for the libraries.
It got very complicated. Right? Like, setuptools is the way it is because it tried to be the first tool that had to try to deal with all this, and it's grown over the decades to try to support all of these crazy use cases. And people love having custom workflows when it comes to packaging. I don't know why, but that's the one place people always feel like they can just totally go nuts in terms of the way they do stuff and be completely different. Maybe because it never sees the outside world, so people are just like, yeah, sure. Why not? No one's ever gonna get mad at me over this. It's never gonna leave the walls of my work. But because of that, it's made proper support really difficult.
So what I hope this will all lead to is a place where people are able to create the tools that actually let them have the workflow that they want, and for that to just continue to grow. Have more tools that do exactly the thing you need the way you want it to happen, whether that's bespoke just for your team or you personally or someone else who builds it and gives you what you want. But I just want to do what I can to lower the difficulty of packaging by, a, making it so that if you write a tutorial about packaging, it applies to all the tools, not just setuptools or just Flit.
But also, b, that you're able to find that tool that does make your life simpler, so that you don't feel like packaging in Python is painful and somehow more painful than any other language out there. And to continue just to keep lowering the difficulty such that it's just a thing. It's never gonna be totally simple and straightforward when you're throwing in C compilers and stuff. I mean, I don't know any language that has that down pat that has to deal with cross-language stuff like this. Right? Like, every language I know that's done it really well only worries about itself typically. And so we're in a really tough spot trying to tackle this problem.
That's my dream. Right? It's just I could find a tool that fits my brain, works the way I want, is nice and simple, and just lets me have the workflow I wanna have. Another avenue
[00:29:36] Unknown:
of the packaging story and the distribution story is the efforts that are underway to make Python more viable for other targets beyond just, you know, laptops and servers, you know, moving into things like the browser or mobile devices. And there are a number of projects going in these avenues. You know, there are things like Kivy and Briefcase and Toga for being able to build for mobile. There's WebAssembly for being able to build for the web, and then there are a whole slew of things that will take your Python code and turn it into JavaScript for you to run in the browser, things like Brython or Transcrypt. And I'm just wondering what your thoughts are on the challenges that exist in trying to be able to target those systems. And as I was preparing for this conversation, I read a post that you had that was kind of postulating on what is the true sort of core of Python? What is its spirit? And if you take these things away from Python, is it still the same language?
And I know that you left it off saying, I don't really have any answers, but these are some of the things I'm thinking about. And I'm wondering if you have come to any conclusions since you wrote that about what is the core of the language, what are the things that we can take out of it to make it easier to build for these new targets while still being able to maintain the sort of heart and soul of what Python is and why we all love it? I don't have an answer.
[00:30:59] Unknown:
Sorry. That has not changed. But I do have goals around this. So, yeah, so there are a lot of wonderful projects that have been trying to, you can call it, solve the mobile problem or solve the browser problem. I don't know about you, but every time I ever read any of these articles about, oh, look at Python growing and look at it take over the world, blah blah blah blah blah, there's always the caveat of, except for mobile or except for the browser. Right? Like, it's the one place you always see mentioned as a caveat about how you can use Python anywhere but those places. And that's not to say Transcrypt and Brython, like, haven't tried to do the browser thing or Briefcase hasn't tried to make things work on mobile. But for me, I try to think at a more fundamental level. Right? Like, those projects are trying to make Python work there by trying to, for instance, compile Python to JavaScript or try to make CPython work on mobile.
And I'm trying to think at a more low level: if you just look at the language, let's not worry about the implementation, what does that look like? What does it take if we're gonna say, okay, here's Python, the language that we know, that has a spec. What does that look like if we try to make that happen in the browser or on mobile? And so what that kinda led me down was thinking about, well, okay, do we really need the whole language? Because one of the funny things, having been doing Python since 2000 and being a core dev since April of 2003, is I predate Python being widely popular. Right? Like, I predate PyPI.
Back when I started, the Vaults of Parnassus, with its animated GIFs of wall sconces, was the place you submitted links to the zip file on your host for people to download, to distribute Python code. Right? Like, I come from a time when people didn't know what the heck Python was. Right? And if they did, they just usually only knew whitespace mattered and that was the weirdest concept in the world. Right? So living in a world where I can't remember the last time I met a software developer who didn't know what Python was still blows my mind to this day, let alone Python's now on Mars.
Right? Like, so because of that, I don't view all of our projects on PyPI as a leading indicator of Python's popularity. I view it as a trailing indicator. So for me, it's not about whether or not you can use something off of PyPI, which might shock some people. But I think PyPI shows that Python is popular, not that it made Python popular. Now, not to say it doesn't help, but I don't think it's the entire reason that people use Python. It just happens to be very handy. So because of that, I don't view it as, okay, how do we do it so that everything on PyPI can be used on a platform? So I don't feel encumbered to have to worry about worldwide CPython compatibility. Right? Like, PyPy does an excellent job because they're going after that group of people, but I'm trying to think at an even lower level. Like, what really defines Python, and thus what you really have to get onto a platform, not what do you have to do in order to download any arbitrary PyPI package.
So from that perspective, my brain started to go, like, okay, well, what is the subset of Python that really defines Python? What I kinda, in my brain, refer to as the minimum viable Python. Right? And if you look at CircuitPython and MicroPython, for instance, they're not fully compatible with the language spec. Right? You cannot run everything that you could write in pure Python and have it run on MicroPython or CircuitPython, which makes sense. It's a different platform. It doesn't use everything off of PyPI because it doesn't need to, and yet it's still widely popular and used and something a lot of people get a lot of enjoyment and productivity from. Right? So, I mean, it's all over the place. So that made me also realize, like, well, okay, if MicroPython and CircuitPython can be popular,
why can't we think about mobile and the browser from this perspective? And so what that led to was this series of blog posts where, as I refer to it, I'm desugaring Python. If you look at Python, a lot of its syntax is actually just syntactic sugar around other Python code. Right? Like, addition is just a bunch of method calls on objects. It's nothing fancy, and you can actually completely implement the plus operator without using plus. And there's a whole lot of other bits of Python that devolve that way. And so I've been doing these blog posts to try to figure out, like, where's the bare minimum where I could actually reimplement the rest of the language based on the smaller subset. Right? Like, do I really need to have a clean direct implementation of plus, or can I just figure out, like, oh, when I see plus, I need to make these calls to these methods? And really, as long as I could do method calls, I really can do all this other stuff.
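A condensed sketch of that desugaring, following the usual semantics of binary plus; it is simplified relative to the full language rules (for instance, a proper subclass of the left operand's type is supposed to get the first shot at the reflected method, which is skipped here).

```python
# Sketch: `a + b` rewritten as the method calls it desugars to.
# Simplified relative to the full language rules (see note above).
def desugared_add(a, b):
    result = NotImplemented
    if hasattr(type(a), "__add__"):
        result = type(a).__add__(a, b)
    if result is NotImplemented and type(a) is not type(b):
        if hasattr(type(b), "__radd__"):
            result = type(b).__radd__(b, a)
    if result is NotImplemented:
        raise TypeError(
            f"unsupported operand type(s) for +: "
            f"{type(a).__name__!r} and {type(b).__name__!r}"
        )
    return result

print(desugared_add(1, 2))           # 3
print(desugared_add("py", "thon"))   # python
```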
Same with import, the with statement, what have you. And then while doing this, I'm also starting to look at, okay, from a basic syntactic level, do I really need to implement all the dynamism of the language? Right? Like, the fact that you can dig into, like, the __dict__ of a live object or a class, or, heck, take an object and even change what class it points at. Right? Like, there's some crazy stuff you can do in Python. Now that's not to say we didn't have reasons for opening up that ability, but there's a cost to it, and that flexibility makes it harder to implement the language. Because if you don't implement the language fully, from the perspective of the way CPython does it and is able to do it, it's really difficult.
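Two of those dynamic abilities in a tiny sketch; both run on CPython today, and it is exactly this flexibility that a stripped-down implementation might choose not to expose.

```python
# Sketch of the dynamism being described: poking at __dict__ and reassigning
# which class an instance points at.  Legal in CPython, but expensive for
# alternative implementations to support.
class Dog:
    def speak(self):
        return "woof"


class Cat:
    def speak(self):
        return "meow"


pet = Dog()
pet.__dict__["name"] = "Rex"   # mutate instance state through __dict__
pet.__class__ = Cat            # change what class the object points at
print(pet.name, pet.speak())   # Rex meow
```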
And so trying to not only tear down the syntax, but also reimplement these core bits of Python in pure Python, with an eye towards, okay, what did I really have to make work? Do I really need to expose this at the Python level? Can I actually get away with not exposing something such that it can still feel like Python and more or less be Python, just without necessarily exposing every single detail that CPython happens to expose? Would that still be a productive target for people to work towards? And so my overarching goal here is: finish these blog posts, figure out what that core of Python is that you can reimplement other parts of Python on top of, and then I wanna write a compliance test suite for Python, both for the full spec, but potentially also coming together with a group of people to kinda define, alright, what is that minimum bit of Python that, if you don't define this and don't implement this, you're not really Python anymore. You shouldn't even call yourself a subset of Python. Right?
Like, it's just too not there to be useful, or too weird, to call yourself a subset of Python anymore. What can we define to say, like, if you implement this, this is enough to teach someone how to program, teach someone Python, and have them be able to build off of it to learn grown-up CPython or what have you. Because if you can define that small subset, that target is smaller than the whole CPython language spec, which means trying to implement that is way easier, potentially, on mobile and the browser using WebAssembly than implementing everything.
And so that's my question. Once again, trying to break everything down to its lower level to figure out what's that small kernel that has to be there because that's a heck of a lot easier because, as I said, to me, PyPI is a trailing indicator of Python's popularity, not the leading indicator. So I think there's a possibility that if we can figure out what the smaller subset is that's useful for people and we build it, people will then target that even if they can't bring all the PyPI with them because they'll just find Python's view of the world and its productivity and fitting your brain and all this stuff useful enough to still use it and not ignore it while still being part of the overall Python community.
Having said all this, I realize this is a massive undertaking. I am doing this in my spare time, multi-year, if I'm lucky. Right? Like, this is not moving fast. If you read my blog posts, you'll know it's like I took a break because I had to worry about Python 3.10 and a bunch of stuff on the steering council. So it's been kinda fits and starts lately. But that's my dream, really: to figure out what is that subset of Python that could potentially be implemented in WebAssembly, mainly because my assumption is if you have WebAssembly, you have the browser automatically, but every mobile device has a browser as well, so there's no reason to think there isn't some WebAssembly target on there. So if you do WebAssembly, hopefully you can do both. And to really kind of target the subset of Python that people find useful and hopefully be able to actually compile down to WebAssembly, not even necessarily have an interpreter, and just be very targeted, straight out small size, whatever. Because in my brain, mobile and web says small download size, fast performance.
MicroPython and CircuitPython have an interesting restriction of not only small size for download on your IoT device, but also small footprint while running. While CPython and PyPy, for instance, have the opposite restrictions. Right? Download size is not a concern, and you can use as much memory as you need to be performant. Who cares? RAM's cheap. Well, I wanna be right in the middle, where download size is small, but memory usage doesn't have to be worried about too much, because most phones, even cheap phones, have multiple megs. Right? It's not measured in kilobytes like it is for MicroPython and CircuitPython, typically.
So I think it's also a different performance characteristic to potentially have to worry about. So that's where it's at. I have spousal approval to do this. My wife, Andrea, took a data science course, and they were using JupyterHub to do their assignments and homework and such. And when she first used it, she connected, used it that day, closed her laptop, opened it the next day, and wasn't connected to the kernel anymore. She's like, what's going on? I'm like, oh, well, they're using this technology called JupyterHub. It needs a live connection to the server. Just restart your browser. It'll reconnect. She's like, why isn't it just in the browser? Why do I have to connect to anything? Why isn't it just running in Chrome?
And I was like, oh, well, no one's implemented Python like that for the browser. It just isn't how it works. And she said, we should fix that. Okay. Just to be clear here, this is a massive project over multiple years. Like, you know, those blog posts you helped me proofread, that's working towards this. Like, you do realize there'll be more of those blog posts. Like, I want you to understand what you're asking here. And she, like, literally thought for a split second and went, yeah. No. You should still fix this. So I had spousal approval to actually work towards getting Python into the browser specifically. Now that isn't to say other people haven't done it. Right? Like, Pyodide, for instance, which came out of Mozilla and is now its own separate project, has a compilation of CPython that does run completely in the browser, and a bunch of the science stack. Right? There's a project called JupyterLite, for instance, that now exists that will actually run classic Jupyter Notebooks in the browser using Pyodide as the back end. So there's no kernel. It's all in the browser. But Pyodide is very much browser oriented, and, once again, it's a WebAssembly compilation of CPython directly.
I personally wanna see if we can take it another step farther in WebAssembly and have a pure WebAssembly target for Python the language, and not just a recompilation of CPython to WebAssembly. So really reaching out there. Crazy goal, but maybe it'll work out, and it'll actually all be worth it. Otherwise, I've at least learned a lot. You've
[00:42:45] Unknown:
been a contributor to CPython for a long time. CPython is where the majority of the center of gravity for Python the language is centered. And there are a number of other implementations and runtimes of Python, so PyPy, you know, RustPython, IronPython, Jython, MicroPython. And, you know, they're all working to be compatible with CPython. And I'm wondering, given your efforts to understand, you know, what is this minimal subset of Python that we can focus on to say, this is actually the core of Python, if you're not compatible with this, then you're not actually part of the same language. Once you have that kind of understanding, what do you see as the potential for CPython to become sort of a footnote in the, you know, long story of Python going forward, and for people to be able to build new implementations that don't necessarily have to carry forward the entire legacy of the CPython interpreter as it was written, you know, back in the early nineties.
[00:43:47] Unknown:
So this is no secret. I've made this opinion known publicly that I view CPython as the reference implementation of Python and the portable C version of it. Right? Like, if you need a version of Python, as long as you have an ANSI C99 compiler, you have Python on that platform. Right? Like, it's very compatible and very usable in a lot of places, and it's the way we test language ideas. But I also view the language as separate from CPython. Right? Like, I work on the CPython you're downloading. But for me, I don't see CPython necessarily going anywhere, just because it fills that niche position of the portable ANSI C version of Python that everyone relies on and can run anywhere.
It might not be the perfect fit for IoT, and it might not be the perfect fit for people who wanna run, like, crazy servers or whatever, but it definitely is good enough for the wide gamut of stuff. So I don't see it going anywhere even if something else picked up popularity, right, just because, once again, it's the reference implementation and C is still the world's common subset of everything, so I don't see that going anywhere. But I would personally love to see a clear separation of the CPython interpreter from the Python programming language, to kinda hit home the fact that, sure, CPython exists, but as you pointed out, PyPy exists, Jython exists, IronPython exists, MicroPython exists, CircuitPython exists. We've mentioned Brython and Transcrypt.
Like, there are a lot of alternative implementations that all have their own little quirks. Right? Like, PyPy works very hard to be bug-for-bug compatible with CPython because it's trying to be the implementation you use on the server. Right? If you're gonna do long running anything with Python, that's the thing that you wanna lean towards and try out and see if it works for you. But a lot of other ones, like MicroPython and CircuitPython, have pages that list the ways they are not compatible, and people seem fine with it. And they're not going to ever be fully compatible, because they have performance constraints that make them go, like, yeah, we could, but it's not worth it. It just won't lead to our goals of this small, performant, runs-on-your-light-bulb implementation of Python.
So I'm fine with it. Like, I am totally fine that not everyone implements the full version of the language based on CPython. Right? Like, the version of the language that I see being specified in the specification is the full version, if you can pull it off. And these are all the language decisions we've made around all these different aspects of the language that lead to the most full featured experience, that give you as much flexibility as possible. But you just don't always need that flexibility. When was the last time you wrote a metaclass?
[00:46:44] Unknown:
It's been quite a while,
[00:46:46] Unknown:
if ever, actually. Yeah. It ain't a common thing. So based on that fact, would it be that horrible if you didn't have metaclasses? Maybe, maybe not. There's stuff in the standard library that requires it, but it's some things, not all things. Right? Like, when was the last time you changed what base class an object pointed at? I don't think I've ever done that, at least not on purpose. Yeah. Exactly. Right? There's a lot of little quirky things you can do in the language that are there. And once again, there's power to it, but it's just not necessary. So it's great that CPython allows for this, and the flexibility is there, and being an interpreter totally allows for it. And it's fantastic that people are trying to speed it up, like Guido and his team on the perf team at Microsoft are trying to do, where they're trying to be backwards compatible while still making Python hopefully up to 50% faster for 3.11.
Like, if they can accomplish that, that's gonna be amazing. But I view that as the CPython goal, not necessarily the Python language goal. Right? Like, I view that as a way to keep that aspect of the language and the community healthy and running and giving them what they feel they want in order to be successful. But there are other aspects of the community that need something else from the language. And I think that's an interesting way of looking at it. Right? Like, everyone needs a different thing from the community. Right? Some people need full out speed and compatibility with everything on PyPI.
Some people need a really small run time that runs in the kilobytes, and some people are somewhere in between. Like, I need Java compatibility. I need .NET compatibility. I need browser compatibility. Everyone's got their own thing. And for me, the real drive in all this is I wanna make it so that people who are already in the community can continue to be in the community because Python is where they need to be, and also as a way to bring more people into the community. Because for me, I'm kinda known for a phrase I once said, like, I came for the language and I stayed for the community, and that is very, very, very true still to this day. Like, all this work I put in, it's not for my benefit. I don't need to be improving packaging. I don't need to be improving the language anymore. I've learned what I've learned. My goal at work is to create the best editing experience in an editor as possible for Python developers, and that can mean, for instance, trying to improve packaging to make it so it's easier for us to build stuff around it, but it's not necessarily as direct of a thing. Now Microsoft still gives me time. Like, I get 20% time plus whatever I can steal on the side to just contribute, just because I want to. Right? And I do that because I want to keep this community going. And part of keeping this community going is making sure Python is where the community is going and to try to bring more people into it, because I think we have such a great community here. I want more people to be able to enjoy it and be a part of it. And that also means being where they happen to be.
And for me, the biggest hole we have is people who need to be in the browser and people who need to be on mobile. So I'm doing the best I can to try to fill those gaps so that we cover as wide of a gamut as possible of where people need to be so that they can stay in the community, and we can bring more people in to enjoy this wonderful place we've created and have a good time.
[00:50:00] Unknown:
And as you've been going through these exercises in the, you know, desugaring blog series. I know that you've been doing some pretty deep dives of figuring out, okay. This is what I'm trying to understand now, and then you end up, you know, 5 levels removed. And I'm just curious, what are some of the most interesting or surprising or unexpected details that you've uncovered in the process?
[00:50:23] Unknown:
To this day, the biggest surprise still is the bug I found in CPython. If you go and read the blog post on in-place binary arithmetic, this is where you do, like, plus equals or star equals to do in-place addition or in-place multiplication. When I did that blog post, I reimplemented that syntax using the right rules and stuff for all of them. And when it came to doing in-place power, my test kept failing. And I was like, why is my test failing? This is working the same way plus equals and minus equals are working. Like, I got in-place addition, subtraction, multiplication, division, all the other operators working totally fine. And then my power test kept failing. Like, what the heck's going on? Because usually when this happens, when I do these blog posts, if there's an error, it's on my side. Right? Like, CPython is a very clean, stable, and well-run project. Right? So the chances that CPython was wrong were zilch at this point. It had always been that I made a mistake or read something wrong or made a bad assumption, and I had to tweak my implementation of the equivalent stuff to make it work. Or the test was just flat out bad.
But in this case, I dug into the C code, and lo and behold, there's a bug. It was doing it wrong. It actually wasn't following the right rules for in-place power. It's been fixed in 3.10, thanks to an external contributor. And the biggest shocker still is that I actually found a bug in Python due to this, which is also 1 of the motivators I have for trying to come up with a compliance test for the language, because even CPython doesn't get it right all the time. And we don't have that separation, as I mentioned earlier, between Python, the language, and CPython, the interpreter.
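To give a sense of the kind of desugaring that blog series walks through, here is a rough, illustrative sketch of how `a **= b` is resolved: try the in-place hook first, then fall back to the regular binary protocol. This is my own simplification, not Brett's blog code or CPython's C implementation, and it leaves out details such as the subclass special-casing and the optional third argument to pow(); the helper name `inplace_pow` is purely for illustration.

```python
def inplace_pow(a, b):
    """Roughly what `a **= b` does before rebinding `a` to the result."""
    # 1. Try the in-place hook, looked up on the type (not the instance).
    ipow = getattr(type(a), "__ipow__", None)
    if ipow is not None:
        result = ipow(a, b)
        if result is not NotImplemented:
            return result

    # 2. Fall back to the binary protocol: __pow__ on the left operand...
    pow_ = getattr(type(a), "__pow__", None)
    if pow_ is not None:
        result = pow_(a, b)
        if result is not NotImplemented:
            return result

    # 3. ...then the reflected __rpow__ on the right operand.
    rpow = getattr(type(b), "__rpow__", None)
    if rpow is not None:
        result = rpow(b, a)
        if result is not NotImplemented:
            return result

    raise TypeError(
        f"unsupported operand type(s) for **=: "
        f"{type(a).__name__!r} and {type(b).__name__!r}"
    )


x = 3
x = inplace_pow(x, 4)  # behaves like x **= 4 for ints, giving 81
```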
So finding that bug made me realize that there isn't a clean separation between CPython, the interpreter, and Python, the language, when it comes to our test suite. Right? Like, when you want to test whether or not you've implemented the language, the way most other implementations do it, they just run the whole test suite for the entire standard library. I mean, that can help, but it's not a clear signal as to, did I get plus right? Did I get subtraction right? Did I do this other thing right? It's just not oriented towards implementations to verify that they have implemented this part of the language spec.
So my hope is we can kinda create this compliance test that a lot of other languages have these days for alternative implementations to be able to target, to help make sure that they actually implement the things they need to without having to run the entire test suite for the standard library to do that same verification. Right? Like, you shouldn't care whether the asyncio tests passed. Right? Like, whatever. That doesn't tell you anything about compliance, about whether you implement method calls appropriately. So that's really been probably the biggest surprise. After that, it's just refamiliarizing myself with a lot of the C code and how classes work and all the slots at the C struct level, and how, when you instantiate classes, various pointers get copied up into the subclasses. Like, when you inherit at the C level in Python, you don't point down. Right? Like, when you think about Python, right, if you ever call the mro() method on a class, it returns the list, or the tuple if you use __mro__, of the class and its base classes in method resolution order. And that's the order that Python goes down when it's trying to find the method that you asked for on an object.
At the C level, that's not how it works. At the C level, it's a bunch of structs. And when you define a subclass of another thing, what it does is it goes, oh, well, you're subclassing this thing. What I'm gonna do is I'm going to now copy all of these function pointers from the base class into my class so that you get faster performance. Right? Because having to walk that tree in Python is a bit slower, versus at the C level where it's just like, oh, well, it's all right there. And if it's not there, it's not implemented. Done. Simple. I just have to reorient myself to the fact that that's the mechanism we use for performance reasons and, oh yeah, it doesn't work at the C level the way it does in the Python world. Like, that's how my brain works: I know how this all works in Python, but having to do the translation, it's been an interesting refresher.
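To make the Python-level half of that concrete, here is a tiny illustration of my own (not from the episode): the class names are made up, and the C-level behaviour is only described in a comment, since slot copying has no direct Python spelling.

```python
class Base:
    def greet(self):
        return "hello from Base"


class Child(Base):
    pass


# mro() / __mro__ give the class and its base classes in lookup order,
# e.g. (Child, Base, object).
print(Child.__mro__)

# Attribute lookup walks that order, so an instance of Child finds greet()
# on Base.
print(Child().greet())

# At the C level there is no walk for slot-based operations: when Child's
# type object is created, CPython copies the relevant function pointers
# (slots) from Base's type struct into Child's, so the check is simply
# "is the slot filled in?"
```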
But, yeah, finding that bug is still the biggest surprise.
[00:54:39] Unknown:
Are there any other aspects of some of the recent work that you've been doing on Python and your views on, you know, what the Python of today and tomorrow is going to be or should be that we didn't discuss yet that you'd like to cover before we close out the show?
[00:54:53] Unknown:
I'm working on a PEP for a lock file format, because that's another thing in the packaging world that we don't have a standard on that everyone seems to kinda have their own version of. And there isn't really a need, necessarily, I think, for everyone to have their own lock file format. So I'm trying to work behind the scenes with some packaging tool maintainers towards, hopefully, a shared lock file format so we can kinda get that done. And then after that, honestly, I just have a bunch of libraries I wanna see written to try to alleviate some of the redundancy. And then I think, actually, most of my goals for packaging will be done at that point, and then I'll focus mostly on WebAssembly and this whole crazy idea of mine.
No. Otherwise, that's pretty much it. I'm really excited about our developer-in-residence. We haven't hired anyone yet, but hopefully that'll happen soon. And I'm really looking forward to what they're able to do in terms of helping us with our massive PR backlog and helping get that cleared out as best we can and trying to improve our whole dev process. Otherwise, I just hope we're able to, as a community, just keep moving forward and keep being a great place, and hopefully seeing everyone somewhere in 2022 after having not seen anyone in person for what's gonna be 3 years come PyCon 2022, which still boggles my mind. And I don't know. Maybe we'll get Python on a 3rd planet.
[00:56:15] Unknown:
We'll land it on Venus. There we go. Alright. Well, for anybody who wants to follow along with you and keep in touch and see all the other things that you're working on, I'll have you add your preferred contact information to the show notes. And so with that, I'll move us into the picks. This week, I'm going to choose a couple of things. So the weather's been getting warm here at least in fits and starts. And rather than have my usual cup of hot tea in the morning, I've been experimenting with some cold brew iced tea. So being able to just throw some tea bags in a jug, put it in the fridge overnight, and then have something in the morning that I don't have to think about is pretty great. And then the other thing is I've recently started watching the Loki series on Disney Plus that they put out, and so far, that's actually been very good. So definitely recommend watching that if you're at all into the Marvel Universe.
[00:56:58] Unknown:
And so with that, I'll pass it to you, Brett. I will definitely second the Loki 1. That's definitely turned out to be great. WandaVision was a lot of fun too, and Loki's turned out to be equally good in a different way. So my picks are kind of Python related, sort of. So 1 of them is, I don't know if you've covered the library Rich that's on PyPI. It's a project to help you give more rich output to your terminal. Right? So if you want a very nice, like, progress bar, that kind of thing, formatted markdown in the terminal, all this kind of stuff. It's really great for that sort of thing.
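For anyone who has not tried it, here is a minimal taste of what Rich does; it assumes the library is installed from PyPI (pip install rich), and the text being printed is made up for the example.

```python
from rich.console import Console
from rich.markdown import Markdown

console = Console()

# Inline markup for colour and style in the terminal.
console.print("[bold green]Done![/bold green] Processed 42 files.")

# Render Markdown straight to the terminal.
console.print(Markdown("# Release notes\n\n* formatted *markdown* in the terminal"))
```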
Well, the creator has actually started work on a new project he's calling Textual, and it's to create a TUI, or text user interface, library based on Rich for the terminal. The reason I bring this up is because I looked into TUIs several months back and just shook my head at ncurses, like, uh-uh. I ain't doing that. That's too much work. Like, there's enough stuff in the terminal where you want something a little bit fancier than a CLI and maybe calling the input function, but you don't wanna go full blown, like, Qt or Kivy or what have you. You want something just kinda simple in between.
And so I've been kinda clamoring and hoping for someone to come up with a new TUI library, and so far, at least, what Will has shown in his GIFs on Twitter has been rather impressive. So I'm very hopeful slash anxious for him to continue on that library and take it somewhere. My other 1 is kind of Python related, but kind of not. If you look at the Python 3.10 release announcements, you will notice that there is a section called, and now for something completely different, to steal a line from Monty Python, that Pablo Galindo Salgado, our release manager, puts in, and that's random physics facts. If you don't know, Pablo has a PhD in physics, and I think it involves black holes, actually.
So he puts in little random facts about some piece of physics that's beyond me, but typically really cool. And I encourage anyone, when you do go and look at the release announcements, any of the 3.10 releases, like, 3.10 beta 2 just came out, to make sure to read to the end and to read that section. For instance, for the beta 2 release, Pablo tells us why there's no such thing as a green star and why you will never see a star look green.
[00:59:24] Unknown:
And I think that's kinda cool, and I'm enough of a physics and science nerd that, even though I don't fully understand it, I think it's really neat to know why we will never see a green star in the sky. Yep. It's those little things like that that make us stay for the community. It's just all those weird little tidbits that people sneak into the language and the ecosystem around it. So definitely, we'll have to take a look at those, and I appreciate you pointing that out.
[00:59:45] Unknown:
Yeah. It's funny. Right? Like, if you read PEP 401, which was an April Fools' joke where we introduced the concept of the FLUFL, right, which is Barry Warsaw, the Friendly Language Uncle For Life, for those of you who don't know the joke, and it's PEP 401 after April 1st. That was not only an April Fools' joke for the community. It was also an April Fools' joke on Barry. There's a whole story behind that where we came up with the idea, and Barry had an idea to name Tim Peters the FLUFL, and so we turned it around and did it to Barry. And so Guido and I had to kinda trick Barry into thinking we were working slowly on the PEP so that the day we launched it, he would just find out it was on him and not just the community. Anyway, I've had people come up to me over the years, you know, like, if you do from __future__ import barry_as_FLUFL, I believe. Yes. It lets you replace the not-equal sign, which is exclamation point equals, with less than greater than, which Barry argued for heavily in the Python 2 to 3 transition as what we should keep as the not-equals sign.
He lost just because of familiarity. Everyone else knows bang equals, but he really loved that syntax. So we implemented it as part of the April Fools' joke, and we put it in the language. For a while there afterwards, I would have people come to me and say, like, can we take this out now? The joke's over. I'm like, no. It's funny. It's part of the history now. We're not taking it out. It'd be like asking us to take antigravity out of the standard library. Right? If you import antigravity, it brings up the classic XKCD comic all about Python.
There's also from __future__ import braces. Right? The language is full of these little jokes that people don't know about. And for a long time, we actually had people, like, really worried that management would find out about these jokes and think the language wasn't serious, and they couldn't bring it to their management to say, no, no, we should use this language. Oh, but it's named after Monty Python, the comedy troupe? That's not serious. Let's use Java. Right? And I think the fact that we've reached this point where, yeah, I don't think anyone cares anymore that it happens to be named after a British comedy troupe, that it has these little inside jokes all over the place, and that it just doesn't take itself that seriously, hasn't become a detriment. So, yeah, I do appreciate that part of the community. Yeah. And 1 of my favorite sort
[01:01:55] Unknown:
of meta meta jokes is the import this implementation, where the text arguing that simple is better than complex is actually implemented as ROT13-encoded text, with the decoder implemented as part of the import.
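For context, this.py ships the Zen of Python ROT13-encoded and decodes it when the module is imported; the snippet below shows the same trick in miniature, using the standard codecs module in place of the hand-rolled translation table the real module builds.

```python
import codecs

# One line of the Zen, ROT13-encoded: "Simple is better than complex."
encoded = "Fvzcyr vf orggre guna pbzcyrk."

print(codecs.decode(encoded, "rot13"))
```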
[01:02:10] Unknown:
Mhmm. Yeah. Tim had a lot of fun, I think, when he did that 1.
[01:02:15] Unknown:
Alright. Well, thank you very much for taking the time to join me again. I always appreciate being able to have conversations with you. So I appreciate the time you've taken today, the time you've taken for the community, and all the work that you do. So thank you again for all of that, and I hope you enjoy the rest of your day. Thanks, Tobias. So same to you, and I hope to see you in Salt Lake City in 2022. Thank you for listening. Don't forget to check out our other show, the Data Engineering Podcast at dataengineeringpodcast.com for the latest on modern data management.
And visit the site at pythonpodcast.com to subscribe to the show, sign up for the mailing list, and read the show notes. And if you've learned something or tried out a project from the show, then tell us about it. Email host@podcastinit.com with your story. To help other people find the show, please leave a review on iTunes and tell your friends and coworkers.
Introduction and Guest Welcome
Brett Cannon's Background and Python Journey
Current Focus Areas in Python Development
Standardizing Python Packaging
Impact of Packaging Standardization
Future of Python Packaging and Distribution
Python on Mobile and Web Platforms
CPython and Alternative Implementations
Upcoming Work and Community Goals
Closing Remarks