Why Bother With Old Technology?

That is the key question. In general, an "old" technology has to offer some kind of advantage to whoever is using it. For example, hand-forging iron has no general advantage over the modern steel mill. But the craftsperson who hand-forges a fence gate needs those skills, materials, and methods to produce a well-crafted piece.

In the computing world today, it is impossible for one person to "get their arms around" or fully comprehend an operating system, from high-level functions down to the operation of hardware. Those are now specialized skills in their own right. But in the "old days" of much simpler hardware and software, knowledge of both was actually REQUIRED, because both were still in development, until a stable and generally accepted OS and hardware platform was established. In fact, the history of personal computing is a series of such "established" platforms, one after another.

My general experience comes from my college studies, which included classes in electrical engineering, and from my hands-on work in repair. In both areas I learned to look at fundamentals, and to approach problems diagnostically.

Many "retrotechnology" skills and methods are based on fundamentals, things which change more slowly AND which have patterns, features or principles which repeat in other areas. These are things which offer advantages, when adapting to "new" skills or technologies today.

The use of forgotten skills and materials makes sense when those offer some advantage or create an opportunity. If you are an unemployed computer engineer, you are not going to spend thousands of dollars on software development tools to make some widget to sell. Yet the Internet lets you become a "craftsperson" and offer one-off bits of computing hardware, generally called "embedded computers".

A good example of this is in hobby robotics, where individuals and small companies offer small widgets all the time. The development platforms for those products are very simple.

"Embedded hardware" means computers dedicated to one area of use: GPS units, cell phones, controllers for machines and robots, and so on. Since those areas of use keep changing, and since computer chips keep changing, there are few "established" platforms.

But today's embedded computing world is caught in a conflict over using "established" operating systems like Linux, Windows, and other companies' OSs. These are HUGE development packages, hundreds of megabytes of programs and files. And yet these embedded computers may have only megabytes of program memory, or even less, much like the "classic" computers of decades ago.

I see this as an opportunity to re-examine old tools for new use. At my university we had older equipment which once cost thousands of dollars but was purchased for hundreds, so we could rebuild and reprogram those systems in direct fashion, down to changing the hardware and writing in assembly language, the human-readable form of the microprocessor's binary instructions. Or one can simply learn how to repair old hardware and make it work.
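As a sketch of what that kind of direct, register-level programming looks like, here is a minimal bare-metal C program that blinks an LED by writing straight to a memory-mapped port register. The port address (0x8000) and the LED's bit position are hypothetical placeholders, not any real chip's values; on actual hardware, both would come from the processor's or board's datasheet.

    /* Minimal bare-metal LED blinker in C. No operating system:
       the program owns the processor. Address and bit are hypothetical. */
    #include <stdint.h>

    #define GPIO_OUT  ((volatile uint8_t *)0x8000u)  /* hypothetical memory-mapped output port */
    #define LED_BIT   (1u << 0)                      /* hypothetical LED wired to bit 0 */

    static void delay(volatile unsigned long n)
    {
        while (n--)
            ;  /* crude busy-wait; actual timing depends on CPU clock and compiler */
    }

    int main(void)
    {
        for (;;) {
            *GPIO_OUT ^= LED_BIT;  /* toggle the LED by flipping one bit in the port */
            delay(50000UL);
        }
    }

Note there is no operating system here at all: the entire "development platform" is a few dozen bytes of machine code plus the chip's datasheet, which is exactly the scale at which the classic machines worked.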

While a "nuts and bolts" level of microcomputing is not often called for today, I would argue that such knowledge is necessary for certain kinds of embedded design and development.

Indeed, there is growing interest in hobby robotics as a literal "nuts and bolts" environment. But there are also moves by Microsoft and other large companies to introduce very complex and large software tools into robotics. They claim these are "efficient" tools, but the motive is simply to grab market share with tools you cannot easily escape from.

By contrast, if you know the fundamentals, and your tools are fundamental, you can always use other tools or adapt YOUR tools for other purposes. Knowing the basics makes you flexible, and I would argue flexibility is still an advantage today.
