Comment by SCUSKU

5 days ago

I don't really understand how or why Bad Apple is becoming the de facto graphics rendering "hello world", but it's fun to see in real time. I came across this demo, which uses Bad Apple to demonstrate high-FPS hypermedia:

https://data-star.dev/examples/bad_apple

2 reasons:

1. The creator is extremely cool about remixes and fan use. Touhou is in many ways the OG modern internet fandom, in a way that previous ones weren't. Your Bad Apple video will not be taken down even though it has the same audio as all the others.

2. The shadow puppet format is recognizable at seemingly any resolution; I have seen examples in a 3x3 grid even. On top of that, it only has two colors (black/white, 1/0), so it's dead simple to convert the video frames into any other format you can imagine with only a 'hello world' understanding of what you're doing.
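
To make that concrete, here is a minimal thresholding sketch. It assumes the frames have already been dumped as grayscale images; the ffmpeg command, the frames/ directory, and the Pillow/NumPy choices are my own assumptions, not anything from the thread or the linked article.

    # Threshold grayscale Bad Apple frames down to 1-bit frames you can render anywhere.
    # Assumes the frames were extracted beforehand, e.g.:
    #   ffmpeg -i bad_apple.mp4 -vf scale=96:64,format=gray frames/%05d.png
    from pathlib import Path

    import numpy as np
    from PIL import Image

    THRESHOLD = 128  # mid-gray cutoff; the source is grayscale, so tune to taste

    def frame_to_bits(path: Path) -> np.ndarray:
        """Load one grayscale frame and return a 2-D array of 0s and 1s."""
        gray = np.asarray(Image.open(path).convert("L"))
        return (gray >= THRESHOLD).astype(np.uint8)

    def frame_to_ascii(bits: np.ndarray) -> str:
        """Render a 1-bit frame as text -- swap this out for LEDs, a terminal, whatever."""
        return "\n".join("".join("#" if b else " " for b in row) for row in bits)

    if __name__ == "__main__":
        for path in sorted(Path("frames").glob("*.png")):
            print(frame_to_ascii(frame_to_bits(path)))

Once each frame is a 2-D array of 0s and 1s, packing it into whatever a display or protocol expects is the easy part, which is a big reason the video is such a forgiving target.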

  • > it only has two colors (black/white, 1/0)

    As the article mentions, the source video is grayscale, not monochrome. Grays are used for motion blur, glow, gradients, etc.

    • True. I think it's more accurate to say that the video can be shown in only black and white while still remaining mostly true to the original content, which isn't the case for most videos. So you can play Bad Apple on a two-color display and still easily recognize it as Bad Apple, which definitely contributes to its "hello world for video" status.

  • > its dead simple to convert the video frames into any other format you can imagine

    Maybe read the linked article about that. ;)

  • For the same reason, if you play rhythm games you'll notice that every single one has Touhou music.

    • The Toby Fox music (Undertale, Deltarune) has been making the rounds in that scene as well.

      Commercial music is really hard to utilize in this way. ZUN and Fox know the music is special to the fans and that they like to see it in different places, so they put the effort in to accommodate it.

The DOOM standard is here: https://www.reddit.com/r/Doom/comments/1c0g0mi/i_made_doom_i...

Built on a fully programmable CPU in redstone.

IRIS Computer Specs:

- Custom 16 bit CPU

- 8 kB of RAM

- 64 kB of ROM

- 1 kB texture ROM

- 96x64 pixel screen - 16 colours

- Floating point unit (add sub mult div sqrt)

- 173 redstone tick clock

- No 3D graphics hardware acceleration (entirely done in software)

- Runs programs written in URCL

- Runs at 1 million ticks per second thanks to MCHPRS server - which is 5.8 kHz clock speed

  • > - Runs at 1 million ticks per second thanks to MCHPRS server - which is 5.8 kHz clock speed

    I had to go look into this, because that's shockingly fast. The latest Intel CPUs hit a 6.2 GHz clock rate when Thermal Velocity Boost kicks in, so at a million ticks per second each Minecraft tick gets roughly 6,000 CPU cycles. Each thread handles 65k surface blocks (a 256x256 plot), so each cycle has to cover upwards of 10 surface blocks, in the most lenient circumstances I can think of.

    I went to look into how on Earth they're doing that with all that Redstone around; there are some docs at [1] if anyone else is curious. It looks like they have some kind of Redstone "compiler" that converts the Redstone blocks into a graph, and execution happens on that graph.

    That's crazy impressive. It does make using Minecraft feel a little silly, to me and perhaps only me. They have an input step where they basically parse the map, convert it to a graph that resembles the AST of an LLVM-style IR, and then execute it. It makes Minecraft feel like a very awkward scripting language to me; why stack 16k Redstone cubes by hand just so they can be parsed into an IR, instead of just generating the IR with a script or something like that?

    1. https://github.com/MCHPR/MCHPRS/blob/master/docs/Redpiler.md
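
    To make the "compile the blocks into a graph, then tick the graph" idea concrete, here is a toy sketch. It does not reflect Redpiler's actual node types or data structures (those are in the MCHPRS source linked above); it only shows the general shape of why ticking a small dependency graph is so much cheaper than simulating every block.

        # Toy redstone-as-graph evaluator: nodes instead of blocks, edges instead of wiring.
        # Purely illustrative -- none of this matches Redpiler's real implementation.
        from dataclasses import dataclass, field

        @dataclass
        class Node:
            kind: str  # "lever", "torch" (an inverter), or "lamp"
            inputs: list[int] = field(default_factory=list)  # indices of upstream nodes
            state: bool = False

        def tick(nodes: list[Node]) -> None:
            """One synchronous update: every node reads last tick's states, then writes."""
            prev = [n.state for n in nodes]
            for n in nodes:
                if n.kind == "lever":
                    continue  # external input, left as-is
                powered = any(prev[i] for i in n.inputs)
                n.state = (not powered) if n.kind == "torch" else powered

        # lever -> torch (inverter) -> lamp
        nodes = [Node("lever"), Node("torch", inputs=[0]), Node("lamp", inputs=[1])]
        nodes[0].state = True
        for _ in range(2):  # one tick per hop for the signal to settle
            tick(nodes)
        print("lamp lit:", nodes[2].state)  # lever on -> torch off -> lamp stays dark

    The point is just that once the world has been lowered to a graph like this, each tick is a tight loop over a few thousand nodes rather than a voxel simulation, which is presumably where the million-ticks-per-second figure comes from.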

    • Minecraft has a unique and diverse community; it's what gives the game its staying power.

      I would imagine they used one of many tools for copy/paste in Minecraft, especially with the repetitive nature of the components.

      Mojang (and now Microsoft) have taken ideas from the community to make the game better. It would be interesting if they incorporated the Redpiler idea, because redstone can get laggy.

One less often mentioned trait of Touhou songs, including the original Bad Apple!![1], is that, at least to me, they resemble a data bus status display more than music; it makes more sense to imagine listening to the even bits of a 16-bit bus tied to instruments as MS-DOS boots than to music with a regular tempo and musical measures. That's to be expected, as these songs were written for hardcore PC-88/PC-98 shooter games by the developer of the Touhou games, all by himself, without formal training in music theory. I think that makes them more familiar to embedded hardware engineers than most other music.

Another factor is the nicovideo.jp / nico-tech community that developed out of 2ch/futaba culture. Lots of users with far more domain expertise than pay or financial ambition threw their skills into remixes for fun (many were STEM students back then). Unidentified FPGA wizards, motor driver experts, and video editors would just come by and drop psychedelic videos. It was absurd. So absurd that Maker Faire Tokyo once put suspected nico-tech dress-shirt types into a quarantine zone in a separate venue to save face for the ambitious t-shirt webdevs (that was naughty, led to the creation of the nico-tech meetups, and was never repeated). That absurd density of content quality and quantity certainly built up momentum for Bad Apple!! PVs.

Undoubtedly one last key element was that the PV was monochromatic (okay, grayscale). That's probably why it was this PV, and not one of the other ones from the golden age of nicovideo.jp, that stuck.

1: https://www.youtube.com/watch?v=Yw5HTeT_dis

That the video is entirely monochrome while also extremely fluid and intricate makes for an interesting duality when applying it to a technical problem; and the fact that it's also a very pleasing and impressive work of art in and of itself gives it, I would think, many of the qualities that demosceners in particular appreciate.

https://www.pouet.net/prod.php?which=63591

8088 Domination used an awful lot of technical tricks to show Bad Apple at full frame rate, full screen, on a real Intel 8088 processor, back in 2014. As far as I know, that was the first and most recognized use of Bad Apple as a "demo benchmark", alongside Doom, in the game of trying to display it on as much hardware as possible.