Floppinux – An Embedded Linux on a Single Floppy, 2025 Edition

(krzysztofjankowski.com)

185 points | by GalaxySnail 10 hours ago ago

111 comments

  • sockbot 9 hours ago ago

    Over Christmas I tried to actually build a usable computer from the 32-bit era. Eventually I discovered that the problem isn't really the power of the computer. Computers have been powerful enough for productivity tasks for 20 years, excepting browser-based software.

    The two main problems I ran into were 1) software support at the application layer, and 2) video driver support. There is a herculean effort on the part of package maintainers to build software for distros, and no one has been building 32-bit versions of software for years, even if it is possible to build from source. There is only a very limited set of software you can use, even CLI software, because so many things are built with 64-bit dependencies. Secondly, old video card drivers are being dropped from the kernel. This means all you have is basic VGA "safe-mode" level support, which isn't even fast enough to play an MPEG2. My final try was to install Debian 5, which was period correct and had support for my hardware, but the live CDs of the time were not hybrid so the ISO could not boot from USB. I didn't have a burner so I finally gave up.

    So I think these types of projects are fun for a proof of concept, but unfortunately are never going to give life to old computers.

    • tombert 8 hours ago ago

      > Computers have been powerful enough for productivity tasks for 20 years

      It baffles me how usable Office 97 still is. I was playing with it recently in a VM to see if it worked as well as I remembered, and it was amazing how packed with features it is considering it's nearing thirty. There's no accounting for taste but I prefer the old Office UI to the ribbon, there's a boatload of formatting options for Word, there's 3D Word Art that hits me right in the nostalgia, and Excel 97 is still very powerful and supports pretty much every feature I use regularly. It's obviously snappy on modern hardware, but I think it was snappy even in 1998.

      I'm sure people can enumerate here the newer features that have come in later editions, and I certainly do not want to diminish your experience if you find all the new stuff useful, but I was just remarkably impressed by how much cool stuff was packed into the software.

      • flomo 7 hours ago ago

        I think MS Word was basically feature-complete with v4.0 which ran on a 1MB 68000 Macintosh. Obviously they have added lots of UI and geegaws, but the core word processing functionality hasn't really changed at all.

        (edit to say I'm obviously ignoring i18n etc.)

        • blackhaz 7 hours ago ago

          My dad used to run a whole commercial bank on MS Office 4.0 and a 386. (A small one, but still!)

          • hilti 6 hours ago ago

            I love this story where a C64 in Poland runs an auto repair shop.

            https://www.popularmechanics.com/technology/gadgets/a23139/c...

            • cbdevidal 6 hours ago ago

              I still use Office 2010 to this day and feel like absolutely nothing I truly need is missing. The only issues are that Alt-Tab and multiple monitors have bugs. But functionality? 100%.

          • 2b3a51 6 hours ago ago

            Small, medium and large colleges in the UK ran on Novell servers and 386 client machines with windows for workgroups and whatever Office they came with. I think the universities were using unixy minicomputers then though. Late 80s early 90s. Those 386 machines were built like tanks and survived the tender ministrations of hundreds of students (not to mention some of the staff).

      • MrGilbert 7 hours ago ago

        It's wild to remember that I basically grew up with this type of software. I was there when the MDI/SDI (Multi-Document Interface / Single-Document Interface) discussion was ongoing, and I remember how much backlash the "Ribbon" interface received. It also shows that writing documents hasn't really changed in the past 30 years. I wonder if that's a good or bad development.

        With memory prices skyrocketing, I wonder if we will see a freeze in computer hardware requirements for software. Maybe it's time to optimize again.

        • hnlmorg 6 hours ago ago

          Consumer laptops have been frozen on 8GB of RAM for a while already.

          Yeah, you can get machines which are higher specced easily enough, but they’re usually at the upper end of the average consumer's budget.

        • anthk 4 hours ago ago

          Sadly Electron developers will be fired, and C++ and even Rust ones will be highly praised. QT5/6 will be king for tons of desktop software.

          • krzyk 2 hours ago ago

            One can dream.

      • blackhaz 7 hours ago ago

        I have MS Office 4.0 installed on my 386DX-40 with 4 MB of RAM and 210 MB HDD, running Windows 3.1, and it is good. Most of the common features are there, it's a perfectly working office setup. The major thing missing is font anti-aliasing. Office 95 and 97 are absolutely awesome.

        • hilti 6 hours ago ago

          Totally agree! I'd definitely pay $300 (lifetime license) for a productivity suite with the Windows 95 design and Office 95, with no bloatware and no ads. Just pure speed and productivity.

      • justapassenger 7 hours ago ago

        Last true step change in computer performance for general home computing tasks was SSD.

        • Cthulhu_ 2 hours ago ago

          I'd add multicore processors as well, which make multiprocess computing viable. And as a major improvement, Apple's desktop CPUs, which are fast, energy efficient, and cool - my laptop fan never turns on. At one point I was like "do they even work?" so I ran a website that uses CPU and GPU to the max, and... still nothing, stuff went up to 90 degrees but no fan action yet. I installed a fan control app to demonstrate that my system does in fact have fans.

          Meanwhile my home PC starts blowing whenever I fire up a video game.

        • johnisgood 6 hours ago ago

          In 20 years? That is nothing.

      • mikepurvis 8 hours ago ago

        It's crazy to realise how much of the multi-application interop vision was realized in Office 97 too. Visual Basic for Applications had rich hooks into all the apps, you could make macros and scripts and embed them into documents, and you could embed documents into each other.

        It's really astonishing how full-featured it all was, and it was running on those Pentium machines that had a "turbo" button to switch between 33 and 66 MHz and just a few MBs of RAM.

      • dfex 3 hours ago ago

        This! I have the 14-core M4 Macbook Pro with 48GB of RAM, and Word for Mac (Version 16 at this time) runs like absolute molasses on large documents, and pegs a single core between 70 and 90% for most of the time, even when I'm not typing.

        I am now starting to wonder how much of it has to do with network access to Sharepoint and telemetry data that most likely didn't exist in the Office 97 dial-up era.

        Features-wise - I doubt there is a single feature I use (deliberately) today in Excel or Word that wasn't available in Office 97.

        I'd happily suffer Clippy over Co-Pilot.

      • rkagerer 6 hours ago ago

        The curse-ed ribbon was a huge productivity regression. I still use very old versions of Word and Excel (the latter at least until the odd spreadsheet exceeds size limits) because they're simply better than the newer drivel. Efficient UI, proper keyboard shortcuts with unintrusive habit-reinforcing hints, better performance, not trying to siphon all my files up to their retarded cloud. There is almost nothing I miss in terms of newer features from later versions.

      • pjmlp 6 hours ago ago

        Except for Internet surfing, a plain Amiga 500 would be good enough for what many folks do at home, between gaming, writing letters, basic accounting and the occasional flyers for party invitations.

        • flomo 6 hours ago ago

          Total nostalgia talk. Those machines were just glacially slow at launching apps and really at everything; run a spell check and you could go get a coffee. I could immediately tell the difference between a 25MHz Mac IIci and a 25MHz Mac IIci with a 32KB cache card. That's how slow they were.

          • bombcar 17 minutes ago ago

            Those machines could be pretty darn fast - if you get one, run the earliest software that still worked on it. DOS-based apps would fly on a 486, even as Windows 95 would be barely usable.

          • pjmlp 5 hours ago ago

            Some of us do actually use such machines every now and then.

            The point being made was that for many people whose lives don't revolve around computers, their computing needs have not changed since the early 1990's, other than doing stuff on the Internet nowadays.

            For those people, using a digital typewriter hardly requires more features than Final Writer, and for what they do with numbers in tables and a couple of automatically updated cells, something like Superplan would also be enough.

            • flomo 4 hours ago ago

              Yeah, I just posted that a lot of that software was amazing and pretty 'feature-complete', all while running on very limited old personal computers.

              Just please don't gaslight us with some alternate Amiga bullshit history. All that shit was super slow, you were begging for +5Mhz or +25KB of cache. If Amiga had any success outside of teenage gamers, that stuff would have all been historical, just like it was on the Mac.

              • Gormo 2 hours ago ago

                The Amiga had huge success outside of "teenage gamers", even if in niche markets. Amigas were extremely important in TV and video production throughout the 1990s. I remember a local Amiga repair shop in South Florida that stayed in business until about 2007, mainly by servicing Amigas still in service in the local broadcast industry -- all of the local cable providers in particular had loads of them, since they were used for the old Prevue Guide listings, along with lots of other stuff.

              • pjmlp 3 hours ago ago

                Goes both ways, the Mac was hardly something to write home about outside the US, and they did not follow Commodore's footsteps into bankruptcy out of sheer luck.

              • 2000UltraDeluxe 3 hours ago ago

                Amiga was big in Europe. No doubt they were slow though; most computers of the time were.

        • hilti 6 hours ago ago

          Or controlling the heating and AC systems at 19 schools under its jurisdiction using a system that sends out commands over short-wave radio frequencies

          https://www.popularmechanics.com/technology/infrastructure/a...

      • deafpolygon 6 hours ago ago

        it’s also proof that Microsoft hasn’t done much with office in decades… except add bloat, tracking, spyware…

      • nxobject 5 hours ago ago

        > old Office UI to the ribbon

        Truly, I do not miss the swamp of toolbar icons without any labels. I don't weep for the old interface.

    • amne 3 hours ago ago

      I used to run a CS 1.6 server on an AMD 800MHz with 256MB of RAM in the 2000s. These days I'm looking to get a Mac mini, and while thinking that 16GB will not be enough I remembered that server. It was a NAT gateway too, and also ran a webserver with hit stats for the CS server. And it was a popular 16v16 type of server too. What happened? How did we get to 16GB as the bare minimum, with 32GB needed to not make you sad?

    • zokier 7 hours ago ago

      > There is a herculean effort on the part of package maintainers to build software for distros, and no one has been building 32 bit version of software for years, even if it is possible to build from source. There is only a very limited set of software you can use, even CLI software because so many things are built with 64 bit dependencies

      That seems odd? Debian 12 Bookworm (oldstable) has a fully supported i386 port. I would expect it to run reasonably well on late 32-bit era systems (Pentium 4/Athlon XP).

      • jabl 6 hours ago ago

        AFAIU the Debian i386 port has effectively required i686-level CPUs for quite a long time (CMOV etc.)? So if he has an older CPU like the Pentium it might not work?

        But otherwise, yes, Debian 12 should work fine as you say. Not so long ago I installed it on an old Pentium M laptop I had lying around. It did take some tweaking; it turned out that the wifi card didn't support the WPA2/3 mixed mode which I had configured on my AP, so I had to downgrade security for the experiment. But video was hopeless, it couldn't even play 144p videos on YouTube without stuttering. Maybe the video card (some Intel thing, using the i915 driver) didn't have HW decoding for whatever video codec YouTube uses nowadays (AV1?).

        • UncleSlacky 5 hours ago ago

          You can force YouTube to use H264 instead (via extensions like H264ify), that should reduce the processing load.

          • 2000UltraDeluxe 2 hours ago ago

            Were there actually Pentium M chipsets that could decode anything but MPEG2?

            The CPU will be struggling with most modern video formats including h.264.

          • jabl 4 hours ago ago

            Good point. Though too late in this particular case, since the battery was also busted, I ended up e-wasting the machine.

    • jsdevrulethewr 7 hours ago ago

      > Eventually I discovered that the problem isn't really the power of the computer.

      Nope, that’s a modern problem. That’s what happens when the js-inmates run the asylum. We get shitty bloated software and 8300 copies of a browser running garage applications written by garbage developers.

      I can’t wait to see what LLMs do with that being the bulk of their training.

      Exciting!

      • dariosalvi78 6 hours ago ago

        Not gonna disagree with you, but, as a solo developer who needs to reach audiences of all sorts, from mobile to powerful servers, the most reasonable choice today is Javascript. JS, with its "running environments" (Chrome, Node, etc.), has done what Java was supposed to do in the 90s. It's a pity that Java didn't keep its promises, but the blame is all on the companies that ran the show back then (and run the show now).

        • hilti 6 hours ago ago

          Javascript is not the problem at all.

          Rookie developers who pull in hundreds of node modules or huge CSS frameworks are ruining performance and hurting the environment with bloated software that wastes energy and people's time.

    • littlecranky67 7 hours ago ago

      I was on Linux as my main driver in the early 2000s and we did watch movies back then, even DVDs. Of course, the formats were not HD; it was DivX or DVD ISOs. I remember running Gentoo and optimizing build flags for mplayer to get it working, at a time when I had a 500MHz Pentium III, later 850MHz. And I also remember having to tweak the mplayer output driver params to get good and smooth playback, but it was possible (mplayer -vo xv for Xvideo support). IIRC I got DVD .iso playback to run even on the framebuffer without X running at all (mplayer -vo fb). Also the "-framedrop" flag came in handy (you can get away with a bit less than 25fps when under load). Also, you definitely needed compile-time support for SSE/SSE2 in the CPU. I am not even sure I ever had a GPU that had video decoding support.
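
      From memory, the invocations looked roughly like this (flags from recollection and filenames are placeholders, so treat it as a sketch):

        # Xvideo output, dropping frames when the CPU falls behind
        mplayer -vo xv -framedrop movie.avi

        # DVD ISO straight to the framebuffer, no X running
        # (the output driver is called fbdev in some builds, fb in others)
        mplayer -vo fbdev -framedrop -dvd-device movie.iso dvd://1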

      • anthk 4 hours ago ago

        mpv and yt-dlp will fix that today.

    • forinti 3 hours ago ago

      I have a P166 under my desk and once in a blue moon I try to run something on it.

      My biggest obstacles are that it doesn't have an ethernet port and that it doesn't have BIOS USB support (although it does have a card with two USB ports).

      I've managed to run some small Linux distros on it (I'll definitely try this one), but, you're right, I haven't really found anything useful to run on it.

    • 1313ed01 8 hours ago ago

      NetBSD is probably what would make most sense to run on that old hardware.

      Alternatively you may have accidentally built a great machine for installing FreeDOS to run old DOS games/applications. It does install from USB, but needs a BIOS, so you can't run it on modern PC hardware.

      • iberator 7 hours ago ago

        NetBSD is the only modern 32-bit Unix still running like a charm on 32-bit hardware. OpenBSD is second, with great wifi support.

    • mrighele 2 hours ago ago

      It seems that both OpenBSD [1] and NetBSD [2] still support i386, for example here [3] you can find the image for a USB stick.

      I expect at least the base system (including X) to work without big issues (if your hardware is supported), for extra packages you may need a bit of luck.

      [1] https://www.openbsd.org/plat.html

      [2] https://wiki.netbsd.org/ports/

      [3] https://wiki.netbsd.org/ports/i386/

    • 2b3a51 6 hours ago ago

      My 32-bit laptop is a Thinkpad T42 from 2005 which has a functioning CDROM and can run a Slackware 15 stable 32-bit install OK-ish, so I haven't tried any of this, but:

      My first thought: how about using a current computer to run qemu, then mounting the Lenny iso as an image and installing to a qemu hard drive? Then dd the hard drive image to your 32-bit target. (That might need access to a hard drive caddy depending on how you can boot the 32-bit target machine, so a 'hardware regress' I suppose.)

      My second thought: if the target machine is bootable from a more recent live Linux, try a debootstrap install of a minimal Lenny with networking (assuming you can connect the target machine to a network, I'm guessing with a cable rather than wifi). Reboot and install more software as required.
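
      Roughly what I have in mind, untested and with placeholder image names, device paths and mirror URL:

        # first thought: build the system inside qemu on a modern box
        qemu-img create -f raw lenny.img 8G
        qemu-system-i386 -m 256 -hda lenny.img -cdrom debian-lenny-i386-netinst.iso -boot d

        # then clone the finished image onto the target's disk (e.g. via a caddy);
        # double-check the device name before running dd
        dd if=lenny.img of=/dev/sdX bs=4M conv=fsync

        # second thought: from a live linux on the target, debootstrap from the archive
        # (probably needs --no-check-gpg since the Lenny keys have long expired)
        debootstrap --no-check-gpg lenny /mnt/target http://archive.debian.org/debian/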

      • wink 6 hours ago ago

        I have OpenBSD running on my old 2004 Centrino notebook (I might be lagging 2-3 versions behind, I don't really use it, just play around with it) and it's fine until you start playing YouTube videos, that is kinda hard on the CPU.

        • 2b3a51 5 hours ago ago

          Yes, NetBSD and OpenBSD work fine on the 2005 T42 but as you say video performance is low. Recent OpenBSD versions have had to reduce the range of binary packages (i.e. outside of the base and installed with pkg_add) on i386 because of the difficulty of compiling them (e.g. Firefox, Seamonkey needing dependencies that are hard to compile on i386, a point the poster up thread made).

    • endgame 8 hours ago ago

      You might have some luck applying isohybrid(1) to the period-correct .iso image, making it bootable by other means: https://manpages.debian.org/stretch/syslinux-utils/isohybrid...
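
      Something like this, untested and assuming the period image uses isolinux (the device path is a placeholder):

        # rewrite the ISO 9660 image so it can also boot as a plain disk image
        isohybrid debian-lenny-i386-netinst.iso

        # then write it straight to the USB stick
        dd if=debian-lenny-i386-netinst.iso of=/dev/sdX bs=4M conv=fsync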

    • iberator 7 hours ago ago

      You can always run Linux off the DOS partition with a vmlinux loader. Or the Slackware DOS version (forgot its name).

      Don't lose hope. You can boot it one way or another :)

    • anthk 4 hours ago ago

      The last release of NetBSD still has drivers for that old hardware.

  • Fiveplus 9 hours ago ago

    The persistence strategy described here (mount -t msdos -o rw /dev/fd0 /mnt) combined with a bind mount to home is a nice clever touch for saving space.
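
    For reference, my reading of that setup is roughly this (a sketch, not the article's exact commands):

      # mount the boot floppy's FAT12 filesystem read-write
      mount -t msdos -o rw /dev/fd0 /mnt

      # keep /home on the floppy so files survive a reboot
      mkdir -p /mnt/home
      mount -o bind /mnt/home /home

      # flush pending writes before removing the disk or powering off
      sync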

    I don't know if that's also true for data integrity on physical magnetic media. FAT12 is not a journaling filesystem. On a modern drive, a crash during a write is at best annoying, while on a 3.5" floppy with a 33MHz CPU a write operation blocks for a perceptible amount of time. If the user hits the power switch or the kernel panics while the heads are moving or the FAT is updating, that disk is gone. The article mentions sync, but sync on a floppy drive is an agonizingly slow operation that users might interrupt.

    Given the 253KiB free space constraint, I wonder if a better approach would be treating the free space as a raw block device or a tiny appended partition using a log-structured filesystem designed for slow media (like a stripped down JFFS2 or something), though that might require too many kernel modules.

    Has anyone out there experimented with appending a tar archive to the end of the initramfs image in place for persistence, rather than mounting the raw FAT filesystem? It might be safer to serialize writes only on shutdown; would love more thoughts on this.

    • userbinator 9 hours ago ago

      Controversial position: journaling is not as beneficial as commonly believed. I have been using FAT for decades and never encountered much in the way of data corruption. It's probably found in far more embedded devices than PCs these days.

      • Skunkleton 8 hours ago ago

        If you make structural changes to your filesystem without a journal, and you fail mid way, there is a 100% chance your filesystem is not in a known state, and a very good chance it is in a non-self-consistent state that will lead to some interesting surprises down the line.

        • userbinator 8 hours ago ago

          No, it is very well known what will happen: you can get lost cluster chains, which are easily cleaned up. As long as the order of writes is known, there is no problem.

          • dezgeg 5 hours ago ago

            Better hope you didn't have a rename in progress with the old name removed without the new name in place. Or a directory entry written pointing to a FAT chain not yet committed to the FAT.

            Yes, soft-updates-style write ordering can help with some of the issues, but the Linux driver doesn't do that. And some of the issues are essentially unavoidable, requiring a full fsck on each unclean shutdown.

        • ars 8 hours ago ago

          FAT has two allocation tables, the main one and a backup. So if you shut it off while manipulating the first one you have the backup. You are expected to run a filesystem check after a power failure.
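
          Something along these lines (dosfstools, untested on a real floppy):

            # check the FAT12 floppy, repairing automatically where possible
            dosfsck -a -v /dev/fd0

            # newer dosfstools install the same tool as fsck.fat
            fsck.fat -a -v /dev/fd0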

    • M95D 4 hours ago ago

      FAT can be made tolerant from the driver side, just like a journaled FS:

        1) mark blocks allocated in first FAT
        If a crash occurs here, then data written is incomplete, so write FAT1 with data from FAT2 discarding all changes.
        
        2) write data in sectors
        If a crash occurs here, same as before, keep old file size.
        
        3) update file size in the directory
        This step is atomic - it's just one sector to update. If a crash occurs here (file size matches FAT1), copy FAT1 to FAT2 and keep the new file size.
        
        4) mark blocks allocated in the second FAT
        If a crash occurs here, write is complete, just calculate and update free space.
        
        5) update free space
      • ale42 3 hours ago ago

        Is this something the FAT driver in Linux can do?

    • iberator 7 hours ago ago

      PS: In the good old days there was no initrd or other ramdisk stuff - you read the entire system straight from the disk. Slackware 8 was like that for sure, and NetBSD (even the newest one) still does it by default.

    • zx8080 9 hours ago ago

      > If the user hits the power switch or the kernel panics while the heads are moving or the FAT is updating, that disk is gone.

      Makes sense, great point. I would rather use a second drive for the writable disk space, if possible (I know how rare it is now to have two floppy drives, but still).

    • ars 8 hours ago ago

      > If the user hits the power switch or the kernel panics while the heads are moving or the FAT is updating, that disk is gone.

      This isn't true, I commented lower in the thread, but FAT keeps a backup table, and you can use that to restore the disk.

  • hilti 9 hours ago ago

    I remember the QNX Demo on a 1.44 MB floppy disk. It booted straight into a full blown window manager and had a basic web browser. That was 1999 and I never saw anything like that afterwards.

  • M95D 5 hours ago ago

    I wonder if formatting the floppy is necessary. Could syslinux or maybe lilo load the kernel directly from raw floppy sectors, with the initrd appended to it and the command line built directly into the kernel via CONFIG_CMDLINE? I know u-boot can do it, but that's 8+ MB.
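
    I'm thinking of something along these lines (untested; the command line and cpio path are just placeholders):

      # embed the initramfs and the command line into the kernel image itself,
      # so the floppy only needs to hold a single file
      ./scripts/config --enable CONFIG_BLK_DEV_INITRD \
                       --enable CONFIG_CMDLINE_BOOL \
                       --set-str CONFIG_CMDLINE "console=tty0" \
                       --set-str CONFIG_INITRAMFS_SOURCE "/path/to/rootfs.cpio"
      make olddefconfig
      make bzImage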

    As an alternative, isn't ext2 smaller by having no FAT tables?

  • heinternets 9 hours ago ago

    I miss the floppy disk sound and the anticipation then joy of finally loading into the OS.

    • szszrk 6 hours ago ago

      The omnipresent coil whine in almost every laptop I've got in the past 15 years gives me at least that nostalgic noise that says "the computer is working".

      Wish coil whine was configurable :)

      • tensility 5 hours ago ago

        At a very low level, it is. I know the individual that made a "diagnostic" for the floppy drive while working as a tech on the Apple I and Apple II designs which caused the drive to whine in patterns that were distinctly ... orgasmic.

  • jdub 9 hours ago ago

    > After 5 minutes I got freshly burned floppy.

    oh god

    • userbinator 8 hours ago ago

      That is an indication of someone who grew up in the CD-R/RW era.

  • urbandw311er 6 hours ago ago

    There’s something really lovely about this project - especially as they’re using the last kernel from May 2025 before i486 support was removed. It feels like somebody lovingly mending their car for one last time or something similar. (I’m tired but you can probably find a cuter metaphor)

  • cbdevidal 6 hours ago ago

    It’s amazing to me that the floppy is still a relevant target unit. Just large enough to be useful, small enough to be a real challenge to use well. I don’t see the same passion for 700MB CDROM distributions, probably because the challenge just isn’t there.

  • mobilio 6 hours ago ago

    25 years ago I used floppyfw

    https://www.zelow.no/floppyfw/

    to set up a small router on a 486 with 12 MB RAM, and it ran flawlessly. Later I got a Linksys WRT54GL and decommissioned that machine.

    • enricotr 6 hours ago ago

      Me too, mine was CoyoteLinux.

      • Snoddas 6 hours ago ago

        This brings back memories. I used CoyoteLinux to surreptitiously share my ADSL connection with my SO. This was against my provider's ToS at the time.

  • arthurfirst 3 hours ago ago

    The original software for the ISS (space station) was stored on a single floppy disk. I'm not sure about the density, but one of the engineers told me this.

  • dirkc 7 hours ago ago

    We used to call the 1.44MB (3.5 inch) disks stiffies, since they are rigid, while the physically bigger disks we referred to as floppies.

    And they used to fail all the time, especially when you had something that spanned more than a single disk.

    • urbandw311er 6 hours ago ago

      Is that name used with an eyebrow raised, or did that particular double entendre not make it out of the UK?

      • dirkc 5 hours ago ago

        My level of English was very basic during the age of stiffies, so that double entendre never occurred to me at the time

  • 6LLvveMx2koXfwn 9 hours ago ago

    Did I misremember downloading Slackware to 12 floppies in 1997?

    • flomo 7 hours ago ago

      Before then, a local clone store had an 'insane deal' on floppy disks, and they came with Slackware. I had a Mac, and the floppies weren't very good so.

    • gattilorenz 7 hours ago ago

      MuLinux was also a floppy-based “live” distro, with optional floppy disks for X11, programming languages, etc.

    • cricalix 7 hours ago ago

      12‽ I'd swear the Slackware I downloaded was closer to 30+. On dialup. Via a VAX. Using FTP to go from internet to the VAX box, then Kermit from the VAX to the DOS PC using Procomm Plus. Write it all, start the install sequence, find out that the 18th disk was bad. Reboot. Rinse. Repeat.

      X disks were X11. There were also the A,B, C etc disks.

      Then there was the Coherent install, with a massive manual on ultra-thin paper with the shell on the front.

    • stackghost 8 hours ago ago

      Probably not. Pretty sure it was Puppy Linux (among I'm sure others) that could be run on just two floppies. I used to have this old 933MHz Coppermine system that I took when a medical office was going to throw it out, some time in the early 00s.

      The HDD was borked but it had a 3.5" bay that worked, so I got a floppy-based distro running on it. I later replaced the drive and then made the mistake of attempting to compile X11 on it. Results were... mixed.

  • zoobab 6 hours ago ago

    I was making routers out of old PCs (486s or early Pentiums) with 2 network cards (3com or ne2000) back in 2000, with floppies and CoyoteLinux. Installed tens of them in the students' houses.

  • grewil2 7 hours ago ago

    Since it’s a 1.44M image I assume they use 3.5” diskettes. The terms floppy and diskette are used as synonyms today, but the different names make sense since floppies are flexible and “floppy”. Diskettinux?

  • yjftsjthsd-h 9 hours ago ago

    I thought Linux dropped driver support for real floppy drives. Did that not happen, or am I missing something?

    • jabl 6 hours ago ago

      Someone was still working on some minor cleanups in August 2025: https://lore.kernel.org/lkml/20250825163545.39303-1-andriy.s...

      (That mail also mentions the floppy driver is "basically orphaned" though. But evidently it's still there and builds.)

      Maybe you're thinking of the floppy tape (ftape) driver, which was removed back in the 2.6.20 kernel. Though there's a project keeping an out-of-tree version of it working with recent kernels at https://github.com/dbrant/ftape

    • creatonez 9 hours ago ago

      Don't think so? Linux should still support almost all builtin motherboard floppy controllers, for the platforms it still runs on. ISA floppy controller support is probably not as comprehensive, but not because anything has been dropped.

      • yjftsjthsd-h 8 hours ago ago

        Huh, yeah looks like I misremembered.

    • madduci 9 hours ago ago

      No but I find this line interesting:

      The Linux kernel drops i486 support in 6.15 (released May 2025), so 6.14 (released March 2025) is the latest version with full compatibility.

      • zx8080 9 hours ago ago

        Any chance of backporting changes to be able to run on older hardware?

  • amelius 6 hours ago ago

    I remember the days when Linux came on 50 floppies.

    • tensility 5 hours ago ago

      If I recall correctly, when you wanted all of the bells and whistles that Slackware had to offer, it required 72 floppy disks.

  • hn_throwaway_99 9 hours ago ago

    What's a floppy?

    • vaylian 7 hours ago ago

      https://en.wikipedia.org/wiki/Floppy_disk

      It's basically what people used before USB sticks. But it was also the storage medium that software was sold on, before CD-ROMs became widespread.

    • mrbluecoat 9 hours ago ago

      Floppy is a race of robotic jackalopes, known for their floppy ears. A "Single Floppy" is a rare subset of that species where only one ear flops down due to a random mutation of their hardware.

      • deafpolygon 6 hours ago ago

        Embedding Linux has the characteristic of making the single floppy highly territorial and aggressive.

    • GJim 4 hours ago ago

      It's a real-life 3D save icon.

  • ggm 9 hours ago ago

    MGR on Sun hardware probably could have come close.

  • tensility 5 hours ago ago

    Bring back Slackware?

  • jstrebel 7 hours ago ago

    OK, impressive, but - why? No current computer has a floppy disk drive anymore. The web page claims building such a disk is a learning exercise, but the knowledge offered is pretty arcane, even for regular Linux users. Is this pure nostalgia?

    • cbdevidal 6 hours ago ago

      If you have to ask why this is not for you. Why climb a mountain that’s already been climbed hundreds of times? For the challenge.