Let’s rewrite the Linux kernel in Python
Fun fact!
The Asahi Linux drivers for the Apple M1 GPU were originally written in Python: https://asahilinux.org/2022/11/tales-of-the-m1-gpu/
GPU drivers in Python?!
Since getting all these structures right is critical for the GPU to work and the firmware to not crash, I needed a way of quickly experimenting with them while I reverse engineered things. Thankfully, the Asahi Linux project already has a tool for this: The m1n1 Python framework! Since I was already writing a GPU tracer for the m1n1 hypervisor and filling out structure definitions in Python, I decided to just flip it on its head and start writing a Python GPU kernel driver, using the same structure definitions. Python is great for this, since it is very easy to iterate with! Even better, it can already talk the basic RTKit protocols and parse crash logs, and I improved the tools for that so I could see exactly what the firmware was doing when it crashes. This is all done by running scripts on a development machine which connects to the M1 machine via USB, so you can easily reboot it every time you want to test something and the test cycle is very fast!
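The workflow described above can be sketched in plain Python. This is *not* the m1n1 framework’s actual API — every name and field here is made up — but it shows why Python is so pleasant for this: a firmware structure layout is just data you can tweak and re-pack in seconds between test runs.

```python
# Illustrative sketch only (hypothetical structure, not the real m1n1 API):
# experimenting with a guessed firmware struct layout using the stdlib.
import struct

class InitData:
    """Hypothetical firmware init structure -- all field names are made up."""
    FORMAT = "<IIQQ"  # little-endian: two u32 fields, two u64 fields

    def __init__(self, version, flags, heap_base, heap_size):
        self.version = version
        self.flags = flags
        self.heap_base = heap_base
        self.heap_size = heap_size

    def pack(self):
        """Serialize to the raw bytes you'd hand to the device."""
        return struct.pack(self.FORMAT, self.version, self.flags,
                           self.heap_base, self.heap_size)

    @classmethod
    def unpack(cls, data):
        """Parse raw bytes back into fields (e.g. from a memory dump)."""
        return cls(*struct.unpack(cls.FORMAT, data))

# Tweak a field, re-pack, send it to the hardware, watch what the firmware
# does, repeat -- the whole edit/test cycle stays inside one script.
blob = InitData(version=1, flags=0, heap_base=0xFA00_0000,
                heap_size=0x10000).pack()
assert InitData.unpack(blob).heap_base == 0xFA00_0000
```

When a guess about the layout is wrong, you just edit the format string and rerun — no recompile, no kernel module reload, which is the fast iteration loop the quote is describing.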
Good for testing and iterating, but what about performance? Though I guess getting everything right is more important right now, and translating it into another language will probably require less work that way
It has already been translated into Rust. Python wasn’t ever intended to be used in the “real” driver, but I thought it was a fun anecdote nonetheless.
No, in JavaScript
The prophets have foretold it: https://www.destroyallsoftware.com/talks/the-birth-and-death-of-javascript
Nobody’s built a supercomputer powerful enough to run a Python version of even Linux Lite Edition.
Redstone it is, then.
I mean, someone made a barebones Linux work in Scratch
Like… a kernel written in Scratch?
Yeah, it uses Ed as its text editor
What language was that JPEG compression written in?
You ever heard of lossless compression? Well, they developed lossful anti-compression: it compresses and decompresses the images so many times that the added artifacts create a larger file than the original! Impressive, ain’t it?
we do live in the future
Potato
YOU WERE NEVER MEANT TO ACCESS STARCH DIRECTLY
Fools haven’t even written it well! Translated:
STOP WRITING
-
MEMORY WAS NEVER SUPPOSED TO BE AESSED DIRETLY
-
YEARS OF PROGRAMMING yet STILL ODE IS STILL WRITTEN with memory vulnerabilities
-
Wanted to aess memory diretly anyway? We had a tool for that: It was alled “ASSEMBLY”
-
“Yes please give me NULL of something. Please give me *&* of it” - Statements dreamed up by the utterly deranged
LOOK at what Programmers have been demanding your Respet for all this time, with all of the omputers we built for them
(These are REAL programs, written by REAL Programmers):
??? ??? ???
They have played us for absolute fools
what does this omment even say??? i ant see it
LUV ‘ODE
LUV LINUX
‘ATE WINDOWS
‘ATE ‘LOSED SOURCE
SIMPLE AS
Thank
-
Does anyone even know what Windows is written in?
Originally, Windows was written in C and assembly and ran on top of DOS, but since Windows 2000 and XP it’s been running exclusively on the NT kernel, which is written primarily in C, with some C++ in there as well.
The actual userspace is mostly C++ and C#.
C is fun. That’s why I use it :P
The same way juggling chainsaws is fun I suppose. :)
w… windows 10?
I think I saw online that Windows was written in C++
C++, but a very ugly and old-school dialect of it.
Well… I love C, just the simplicity of the syntax and low level aspect of it makes it the only language you need. F*ck high level languages
low level aspect of it makes it the only language you need
about that https://queue.acm.org/detail.cfm?id=3212479