Hacker Public Radio

180 episodes

  • HPR4560: Arthur C. Clarke: Other Works, Part 2

    2026/1/23
    This show has been flagged as Clean by the host.

    This brings us to a look at some of Arthur C. Clarke’s other stories: A Time Odyssey (2004-2007), Tales From the White Hart (1957), The Nine Billion Names of God (1953), The Star (1955), Dolphin Island (1963), and A Meeting With Medusa (1971). These stories wrap up our look at Clarke’s science fiction, and we have seen a lot of good stuff here. As a final note, we cover Clarke’s Three Laws.

    Arthur C. Clarke: Other Works, A Time Odyssey

    A collaboration between two of science fiction’s best authors: what could possibly go wrong? Well, something went wrong. This series is not bad, but I hesitate to describe it as good. This series was described by Clarke as neither a prequel nor a sequel, but an “orthoquel”, a name coined from “orthogonal”, which means something roughly like “at right angles”, though it is also used in statistics to denote events that are independent and do not influence each other. And in relativity theory Time is orthogonal to Space. And in multi-dimensional geometry we can talk about axes in each dimension as orthogonal to all of the others. It is something I can’t picture, being pretty much limited to three dimensions, but it can be described mathematically.
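
    For the mathematically inclined, “orthogonal” does generalize cleanly past the three dimensions we can picture: two vectors are at right angles exactly when their dot product is zero, no matter how many components they have. In LaTeX notation:

    \mathbf{u} \cdot \mathbf{v} = \sum_{i=1}^{n} u_i v_i = 0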

    It is sort of like the 2001 series, but not really. It has globes instead of monoliths. And the spheres have a circumference and volume related to their radius not by the usual factor of pi, but by exactly three. Just what this means I am not sure, other than that they are not spheres in any usual sense of the word. In this story the spheres seem to be gathering people from various eras and bringing them to some other planet, which gets christened “Mir”, though not as a reference to the Russian space station. It is a Russian word that can mean “peace”, “world”, or “village”; I have seen it used a lot to refer to a village in my studies of Russian history. Anyway, the inhabitants include two hominids (a mother and daughter), a group of British redcoats, Mongols from the era of Genghis Khan, the crews of a UN peacekeeper helicopter and a Russian space capsule, an as-yet-unknown Rudyard Kipling, the army of Alexander the Great… Well, at least they have lots of characters to throw around. They end up taking sides and fighting each other. In the end several of the people are returned to Earth in their own times.
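
    A quick bit of arithmetic shows why that detail is so unsettling: for any ordinary sphere, a great circle’s circumference obeys C = 2πr, so the ratio C/2r is pi, roughly 3.14159. A ratio of exactly 3 is impossible in Euclidean space, which seems to be precisely the point:

    \frac{C}{2r} = \pi \approx 3.14159 \ \text{(ordinary sphere)}, \qquad \frac{C}{2r} = 3 \ \text{(the spheres of the novels)}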

    But the joke is on them. The beings behind the spheres call themselves the Firstborn, because they were the first to achieve sentience. They figure the best way to remain safe is to wipe out any other race that achieves sentience, making them the polar opposite of the beings behind the monoliths in 2001, for whom the mind is sacred. Anyway, the Firstborn have arranged a massive solar flare that will wipe out all life on Earth and completely sterilize the planet, but conveniently it will happen in five years, leaving time for plot development. Of course the people of Earth will try to protect themselves. Then, in the third book of the series, an ominous object enters the solar system. This is of course a callback to the Rama object. It is as if they wanted to take everything from the Rama series and twist it.

    While I love a lot of Clarke’s work and some of Baxter’s as well, I think this series is eminently skippable. The two of them also collaborated on the final White Hart story, which isn’t bad.
    Tales from the White Hart

    This collection of short stories is unified by its setting, a pub called the White Hart, where a character tells outrageous stories. Other characters are thinly disguised science fiction authors, including Clarke himself. Clarke mentions that he was inspired by the Jorkens stories of Lord Dunsany, which are also outrageous tall tales, though lacking the science-fiction aspects of Clarke’s stories. Of course, this type of story has a long history, and we would do well to mention the tales of Baron Munchausen, as well as the stories of L. Sprague de Camp and Fletcher Pratt collected in Tales from Gavagan’s Bar. Spider Robinson would later take this basic idea and turn it into a series of books about Callahan’s Place. Stories of this type are at least as much fantasy as anything, but quite enjoyable, and I think I can recommend all of them as worth the time to while away a cold winter’s evening by a warm fire with a beverage of choice.
    The Nine Billion Names of God

    This short story won a retrospective Hugo, awarded in 2004, as the best short story of 1953. The idea is that a group of Tibetan monks believe the purpose of the universe is to identify the nine billion names of God, and once that has been done the universe will no longer have a purpose and will cease to exist. They have been identifying candidates and writing them down, but the work is very slow, so they decide that a little automation might speed it up. So they get a computer (and in 1953, you should be picturing a room-sized mainframe), and then hire some Western programmers to develop the program to do this. The programmers don’t believe the monks are on to anything here, but a paycheck is a paycheck. They finish the program and start it running, but decide they don’t want to be there when the monks discover their theory doesn’t work, so they take off early without telling anyone and head down the mountain. But on the way, they see the stars go out, one by one.
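
    As a rough plausibility check on the story’s arithmetic (Clarke specifies that the names use a special alphabet, run nine letters or fewer, and obey rules such as no letter occurring more than three times in succession), an alphabet of k letters allows at most

    \sum_{n=1}^{9} k^{n} = \frac{k^{10} - k}{k - 1}

    candidate names before pruning. The alphabet size is my assumption, not the story’s, but k = 13 gives roughly 1.15 × 10¹⁰, so a final count near nine billion is at least the right order of magnitude.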
    The Star

    This classic short story won the Hugo for Best Short Story in 1956. The story opens with the return of an interstellar expedition that has been studying a system whose star went supernova millennia ago. But the expedition’s astrophysicist, a Jesuit priest, seems to be in a crisis of faith. And if you think it implausible that a Jesuit priest could also be an astrophysicist, I would suggest you look into the Belgian priest Georges Lemaître, who first developed the theory of the Big Bang. In the story, the expedition learns that this system had a planet much like Earth, inhabited by intelligent, peaceful beings much like us. In a tragic turn of events, they knew their star was going to explode, but they had no capability for interstellar travel. So they created a repository on the outermost planet of the system, one that would survive the explosion, and left records of their civilization there. And when the Jesuit astrophysicist calculates the time of the explosion and the travel time of its light, he is shaken:

    “[O]h God, there were so many stars you could have used. What was the need to give these people to the fire, that the symbol of their passing might shine above Bethlehem?”
    Dolphin Island

    This is a good Young Adult novel about the People of the Sea, who are dolphins. They save a young boy who had stowed away on a hovership that later crashed; because no one knew he was aboard, he was left among the wreckage when the crew took off in the lifeboats. From here it is the typical Bildungsroman you find in most Young Adult novels. The dolphins bring him to an island, where he becomes involved with a research community led by a professor trying to communicate with dolphins. He learns various skills there, survives dangers, and in the end has to risk his life to save the people on the island. If you have a 13-year-old in your house, this is worth looking for.
    A Meeting With Medusa

    This won the Nebula Award for Best Novella in 1972. It concerns one Howard Falcon, who early in the story is badly injured in an accident involving a helium-filled airship and requires time and prosthetics to heal. But then he promotes an expedition to Jupiter that uses similar technology, a hot-hydrogen balloon-supported craft. This is to explore the upper reaches of Jupiter’s atmosphere, the only feasible approach given the giant planet’s intense gravity; descending any deeper would mean being crushed by the gravity and atmospheric pressure, so that is not possible. The expedition finds there is life in the upper clouds of Jupiter. Some of it is microscopic, a kind of bioluminescent “air plankton”. But there are large creatures as well, one of them like a jellyfish about a mile across. This is the Medusa of the title. Another is a manta-like creature, about 100 yards across, that preys on the Medusae. But when a Medusa starts to take an interest in Falcon’s craft, he decides to get out quickly for safety’s sake. And we learn that, because of the various prosthetics implanted after the airship accident, Falcon is really a cyborg with much faster reactions than an ordinary human.

    As we have discussed previously, Clarke loved the sea, and in this novella he is using what he knows in that realm to imagine a plausible ecology in the atmosphere of Jupiter. Of course when he wrote this novella no one knew about the truly frightening level of radiation around Jupiter, but then a clever science fiction writer could come up with a way to work around that.
    Clarke’s Three Laws

    Finally, no discussion of Arthur C. Clarke can omit his famous Three Laws. Asimov had his Three Laws of Robotics, and Clarke had his Three Laws of Technology.

    1. When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.
    2. The only way of discovering the limits of the possible is to venture a little way past them into the impossible.
    3. Any sufficiently advanced technology is indistinguishable from magic.

    This concludes our look at Arthur C. Clarke, the second of the Big Three of the Golden Age of Science Fiction. And that means we are ready to tackle the Dean of Science Fiction, Robert A. Heinlein.

    Links:

    https://en.wikipedia.org/wiki/A_Time_Odyssey

    https://en.wikipedia.org/wiki/Tales_from_the_White_Hart

    https://en.wikipedia.org/wiki/Joseph_Jorkens

    https://en.wikipedia.org/wiki/Baron_Munchausen

    https://en.wikipedia.org/wiki/Tales_from_Gavagan%27s_Bar

    https://en.wikipedia.org/wiki/Callahan%27s_Crosstime_Saloon

    https://en.wikipedia.org/wiki/The_Nine_Billion_Names_of_God

    https://en.wikipedia.org/wiki/The_Star_(Clarke_short_story)

    https://en.wikipedia.org/wiki/Dolphin_Island_(novel)

    https://en.wikipedia.org/wiki/A_Meeting_with_Medusa

    https://en.wikipedia.org/wiki/Clarke%27s_three_laws

    https://www.palain.com/science-fiction/the-golden-age/arthur-c-clarke/arthur-c-clarke-other-works/

  • HPR4559: Enkele off line vertaaltools (Some offline translation tools)

    2026/1/22
    This show has been flagged as Clean by the host.

    Offline Translator tools

    Translate text offline

    LocalTranslate is an offline translation application that uses Firefox's neural translation models (from the mozilla/firefox-translations-models project) to perform high-quality translations locally on your device.

    Note: LocalTranslate is not affiliated with The Mozilla Foundation in any way.
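
    If you want to try it, Flathub apps install the usual way; a minimal sketch (I have not verified the exact Flathub application ID, so look it up first):

    $ flatpak search localtranslate      # find the app's exact Flathub ID
    $ flatpak install flathub <app-id>   # substitute the ID reported above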

    Links

    LocalTranslate by Shriram Ravindranathan on flathub.org

    GPL-3.0 license

    Source Code

    Offline Translator - On-device translation of text and images

    A translator app that performs on-device translation of text and images without sending your data to external servers.

    Features:

    On-device translation using Mozilla's translation models

    Transliteration of non-Latin scripts

    OCR (Optical Character Recognition) for translating text in images

    Automatic language detection

    Image translation overlay that preserves original formatting

    Support for multiple language pairs

    No internet required for translation once models are downloaded

    All translation happens locally

    Links

    Offline Translator by David Ventura on F-droid

    GNU General Public License v3.0 or later: https://spdx.org/licenses/GPL-3.0-or-later.html

    Source Code

    hpr3315 :: tesseract optical character recognition
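
    As a point of comparison with the OCR feature above, a minimal desktop OCR pipeline with tesseract might look like this (filenames are hypothetical, and the Dutch language data must be installed):

    $ tesseract screenshot.png out -l nld   # OCR the image; writes out.txt
    $ cat out.txt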

  • HPR4558: YouTube Subscriptions 2025 #14

    2026/1/21
    This show has been flagged as Clean by the host.

    I am subscribed to a number of YouTube channels, and I am sharing them with you.

    Links:

    https://www.youtube.com/@bulwarkmedia

    https://www.youtube.com/@thefabfaux

    https://www.youtube.com/@TheGreatWar

    https://www.youtube.com/@TheHistoryGuyChannel

    https://www.youtube.com/@TheImmedFamily

    https://www.youtube.com/@TheKoreanWarbyIndyNeidell

    https://www.youtube.com/@TheLanguageTutor

    https://www.youtube.com/@TheLincolnProject

    https://www.youtube.com/@planetarysociety

    https://www.youtube.com/@TheSaxyGamer

    https://www.youtube.com/@JSHIPLIFE

    https://www.youtube.com/@thespiffingbrit

    https://www.youtube.com/@AmyShiraTeitel

    https://www.youtube.com/@thefrielsisters

    https://www.palain.com/

  • HPR4557: Why I prefer tar to zip

    2026/1/20
    This show has been flagged as Clean by the host.

    Why I prefer tar to zip

    I love having choices when it comes to computing, and especially in the world of open source we’re spoilt when it comes to archiving files. There’s TAR, ZIP, GZIP, BZIP2, XZ, 7Z, AR, ZOO, and more. Of all compression formats, it seems that ZIP has gained ubiquity. It’s the one you can use to archive and extract data on nearly every system, including Linux, UNIX, FreeDOS, Android, Windows, macOS, and more. The problem is, ZIP isn’t the best tool for the job of archival. Here’s why I use TAR instead of ZIP whenever possible.

    Each archiving format has an associated command, such as tar, zip, gzip and gunzip, xz, and so on. In terms of compression, they all tend to be broadly comparable at this point. You might save a few kilobytes or megabytes with one algorithm given a specific combination of file types, but it’s fair to say they all produce similar results. Where they differ is in what each command makes available, and in what each file format retains.
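
    If you want to check that claim against your own data, here is a quick sketch (assuming gzip, bzip2, xz, and zstd are all installed, and file.txt is whatever sample you have handy):

    $ gzip -c file.txt > file.txt.gz
    $ bzip2 -c file.txt > file.txt.bz2
    $ xz -c file.txt > file.txt.xz
    $ zstd -c file.txt > file.txt.zst
    $ ls -l file.txt*   # compare the resulting sizes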

    The tar and zip command showdown

    At first glance, tar and zip are similar in capability.

    By default, the tar command generates an archive that’s not compressed. It’s just a single file object that contains smaller file objects within it. The resulting object is basically the same size as the sum of its parts:

    $ tar --create --file archive.tar pic.jpg file.txt
    $ ls -lG
    -rw-r--r-- 1 tux 46049280 Jan 7 10:55 archive.tar
    -rw-r--r-- 1 tux 45965374 Jan 7 10:55 file.txt
    -rw-r--r-- 1 tux 77673 Jan 7 08:34 pic.jpg
    You can use the -0 option to simulate this with the zip command:

    $ zip -0 archive.zip pic.jpg file.txt
    adding: pic.jpg (stored 0%)
    adding: file.txt (stored 0%)
    $ ls -lG
    -rw-r--r-- 1 tux 46049280 Jan 7 10:55 archive.tar
    -rw-r--r-- 1 tux 46043355 Jan 7 10:57 archive.zip
    -rw-r--r-- 1 tux 45965374 Jan 7 10:55 file.txt
    -rw-r--r-- 1 tux 77673 Jan 7 08:34 pic.jpg
    The most common use case of each command, however, definitely includes compression.

    Level of compression

    The balance in choosing either an algorithm (in the case of tar) or a compression level (in the case of zip) is between compression speed and size. In theory, the slower you let the command compress, the smaller the resulting archive; the faster the compression, the bigger the archive.

    Both commands strive to provide you with some control over this.

    By default (without the -0 option), the zip command also compresses the archive it has created. You can adjust the amount of compression with an option ranging from -0 to -9. The default level is -6.
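
    A minimal way to see the trade-off for yourself (the archive names here are arbitrary):

    $ time zip -1 fast.zip file.txt   # fastest, largest
    $ time zip -9 best.zip file.txt   # slowest, smallest
    $ ls -l fast.zip best.zip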

    To add compression to the tar command, you can either use a separate command entirely to compress the resulting TAR file, or you can use one of several options to choose which compression algorithm gets applied to the TAR file during its creation. Here’s an incomplete list, with a worked example after it:

    -z or --gzip: Filters the archive through gzip

    -j or --bzip2: Filters the archive through bzip2

    -J or --xz: Filters the archive through xz

    --lzip: Filters the archive through lzip

    -Z or --compress: Filters the archive through compress

    --zstd: Filters the archive through zstd

    --no-auto-compress: Prevents tar from using the archive suffix to determine the compression program so you can specify one (or not) yourself
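
    To make the two approaches concrete, here is the same archive built both ways (a sketch, assuming a ./docs directory to archive):

    $ tar --create --file docs.tar ./docs && gzip docs.tar   # two steps: produces docs.tar.gz
    $ tar --create --gzip --file docs.tar.gz ./docs          # one step, same result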

    Decoupling the process of archiving from compression makes sense to me. While the zip command is stuck with basically the same old algorithm year after year, a TAR archive can be compressed using whatever compression algorithm you think is best. In some cases, you might make that determination based on the type of data you’re compressing, or you might be limited to the capabilities of your target system, or you might just want to test a hot new compression algorithm.
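
    GNU tar even lets you plug in an arbitrary compressor with --use-compress-program (newer versions accept arguments inside the quoted string, and tar adds -d itself when extracting):

    $ tar --create --use-compress-program='zstd -19' --file archive.tar.zst file.txt pic.jpg
    $ tar --extract --use-compress-program=zstd --file archive.tar.zst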

    Here’s what the zip command does with a 44 MB text file and a JPEG file, at maximum compression:

    $ zip -9 archive.zip file.txt pic.jpg
    adding: file.txt (deflated 90%)
    adding: pic.jpg (deflated 14%)
    $ ls -lG
    -rw-r--r-- 1 tux 4.4M Jan 7 11:17 archive.zip
    -rw-r--r-- 1 tux 44M Jan 7 10:55 file.txt
    -rw-r--r-- 1 tux 76K Jan 7 08:34 pic.jpg
    A compressed archive of 4.4 MB down from a little more than 44 MB isn’t bad.

    Similarly, the tar command with the --gzip option produces a 4.5 MB archive. However, filtering tar through --xz makes a significant improvement:

    $ tar --create --xz --file archive.tar.xz file.txt pic.jpg
    $ ls -lG
    -rw-r--r-- 1 tux users 3.3M Jan 7 11:17 archive.tar.xz
    -rw-r--r-- 1 tux users 44M Jan 7 10:55 file.txt
    -rw-r--r-- 1 tux users 76K Jan 7 08:34 pic.jpg
    At 3.3 MB, it seems that a newer compression algorithm has outperformed ZIP, at least in this particular test. I’m the first to admit that compression tests are subject to many variables, so it’s not globally significant that XZ has done better than ZIP in this one example. With some experimentation, I could probably devise a test that gets better results from ZIP. However, this example does demonstrate that it’s useful to have an archive tool modular enough to accommodate new compression algorithms.

    Output manipulation

    When you extract data from a TAR or ZIP archive, you can choose to either extract specific files or to extract everything all at once. I believe it’s most common to extract everything, because that’s the default behaviour on major desktops like GNOME and macOS. With both the tar and unzip commands, even when you choose to extract everything all at once, you still have a choice of where to put the files you’ve extracted.

    By default, both the tar and unzip commands extract all files into the current directory. If the archive itself contains a directory, then that directory serves as a “container” for the extracted files. Otherwise, the files appear in your current directory. This can get messy, but it’s a common enough problem that Linux and UNIX users call it a “tarbomb” because it sometimes feels like an archive has exploded and left file shrapnel in its wake.
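
    A habit that avoids surprises: list an archive’s contents before extracting, so you can see whether everything sits inside a top-level directory:

    $ tar --list --file archive.tar.xz | head
    $ unzip -l archive.zip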

    However, a tarbomb (or zipbomb) isn’t inherently bad. It’s a valid use case when you want to overlay updated or additional files onto an existing file system. For example, suppose you have a website consisting of several PHP files across several directories. You can take a copy of the site to your development machine to make updates, and then create an archive of just the files you’ve changed. Extract the archive on your web server, and each new version of a file lands exactly where the original came from, because both tar and unzip retain the filesystem structure. I use this feature when doing dot-release updates of several different content management systems, and it makes maintenance pleasantly simple.
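
    A sketch of that workflow (the hostname and paths here are hypothetical):

    $ tar --create --gzip --file update.tar.gz index.php lib/helpers.php
    $ scp update.tar.gz www.example.com:/var/www/site/
    $ ssh www.example.com 'cd /var/www/site && tar --extract --gzip --file update.tar.gz'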

    Both the unzip and tar commands provide an option to change directory before extraction so you can store an archive in one directory but send extracted files to a different location.

    Use the --directory option with the tar command:

    $ mkdir mytar
    $ tar --extract --file archive.tar.xz --directory ./mytar
    $ ls ./mytar
    file.txt pic.jpg
    Use the -d option with unzip:

    $ mkdir myzip
    $ unzip archive.zip -d ./myzip
    $ ls ./myzip
    file.txt pic.jpg
    One feature unzip doesn’t have is the ability to drop leading directories from the archive during extraction. For example, suppose you want to extract files directly into myzip, but you’ve been given an archive containing a leading directory called chaff:

    $ unzip archive+chaff.zip -d ./myzip
    $ ls ./myzip
    chaff
    $ ls ./myzip/chaff
    file.txt pic.jpg
    You don’t want chaff, but there’s no option in unzip to skip it.

    Frustratingly, the unzip command essentially encourages this anti-pattern. To avoid delivering a zipbomb to someone, you thoughtfully nest your files in a useless folder. But by nesting everything in a useless folder, you’ve also prevented your users from extracting the files directly to where they need them.
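
    The usual unzip workaround is to extract somewhere temporary and move the contents yourself (note that the shell glob below skips dotfiles):

    $ unzip archive+chaff.zip -d ./tmp-extract
    $ mv ./tmp-extract/chaff/* ./myzip/
    $ rm -r ./tmp-extract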

    The tar command solves this problem elegantly. You can protect your users from a tarbomb by nesting your files in a useless directory because tar allows any user to skip over any number of leading directories.

    $ tar --extract --strip-components=1 \
    --file archive+chaff.tar.xz --directory ./mytar
    $ ls ./mytar
    file.txt pic.jpg
    Permissions and ownership

    The ZIP file format doesn’t preserve file ownership. The TAR file format does.

    You might not notice this when using ZIP or TAR archives just on your own personal systems; once a file is extracted, you own it. However, running tar as the superuser or with the --same-owner option extracts each file with the same ownership it had when archived, assuming the same user and group exist on the system. There’s no option for that with the unzip command, because the ZIP file format doesn’t track ownership.
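
    For example, restoring a backup with ownership intact (the archive name and destination here are hypothetical):

    $ sudo tar --extract --same-owner --file backup.tar.xz --directory /srv/restore
    $ ls -l /srv/restore   # files carry their original user and group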

    The zip command can preserve file permissions, but again tar offers a lot more flexibility. The --same-permissions, --no-same-permissions, and --mode options let you control the permissions assigned to archived files.
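
    A sketch of both directions of that control: forcing permissions at creation time, then ignoring archived permissions at extraction time:

    $ tar --create --mode='u+rw,go-rwx' --file private.tar notes.txt   # override permission bits in the archive
    $ tar --extract --no-same-permissions --file private.tar           # apply your umask instead of the archived bits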

    Better archiving with tar

    It’s easy to use either ZIP or TAR interchangeably, because for most general-purpose activities their default behaviour is similar and suitable. However, if you’re using archives for mission-critical work involving disparate systems and a diverse set of people, TAR is the technically superior choice. Whether TAR is the “correct” choice depends entirely on your target audience, because there’s no doubt that ZIP has wider support. But all things being equal, TAR is the archive format, and tar the archive command, that I prefer.

    Show notes taken from https://www.both.org/?p=13268
  • HPR4556: Nitro man! RC Cars

    2026/1/19
    This show has been flagged as Clean by the host.

    Today it’s a special Christmas episode, a sort of special show about RC cars.
    So we’re going to talk about the nitro cars.
    The nitro cars are RC cars that run off of this fuel mix that’s something like 20% nitro, with the oil in the gas.
    So the oil is in the gas; it has a little motor, and you can pay, you know, $800 for a motor,
    or you can pay, you know, 50 bucks for a motor.

    https://traxxas.com/products/models/electric/rustler-bl2s



About Hacker Public Radio

Hacker Public Radio is a podcast that releases shows every weekday, Monday through Friday. Our shows are produced by the community (you) and can be on any topic of interest to hackers and hobbyists.