January 18th, 2020

"All they that take the sword..."

It's one of the most-quoted lines of the Bible, although it's usually slightly misquoted as "Whoever lives by the sword, dies by the sword", when in fact the original reads "All they that take the sword shall perish with the sword". The line can be interpreted in a few different ways. Most people simply take it to mean that people who spread violence through their lives are likely to die as a result of violence. Of course, this is a general life principle rather than a scientific fact, because there are plenty of people who lived violent lives but died peacefully, and people who lived peacefully but died violently. When I've used the line in the past, however, I've tended to use it in the more general sense of "You reap what you sow": Whatever you end up doing, saying, and thinking with your life is likely to shape what the rest of your life is going to look like.

I thought about this recently because I picked up Atlas Shrugged again (yes, I'm still trying to slog through it, although I'm getting close to the end) and I came to the part where the following passage got me thinking:

I want you to observe... that those who cry the loudest about their disillusionment, about the failure of virtue, the futility of reason, the impotence of logic--are those who have achieved the full, exact, logical result of the ideas they preached, so mercilessly logical that they dare not identify it. In a world that proclaims the non-existence of the mind, the moral righteousness of rule by brute force, the penalizing of the competent in favor of the incompetent, the sacrifice of the best to the worst--in such a world, the best have to turn against society and have to become its deadliest enemies. Society... have achieved everything they advocated. What complaint do they now have to make? That the universe is irrational? Is it?


Astra Taylor's 2008 documentary film Examined Life features eight present-day philosophers speaking about--apparently--whatever they happened to feel like talking about, but there's a moment at the very end of the film, literally the last thing the film speaks about (the conversation starts 60 seconds before the credits roll), which seems to address a sort of elephant in the room: The question of meaning and meaninglessness. The scene features Cornel West riding in the back of a car being driven by Taylor, and Taylor asks: "One question that keeps coming up, or a phrase, is the idea of a meaningful life. Do you think it is philosophy's duty to speak on this--on how to live a meaningful life? Is that even a relevant... is that even an appropriate question for a philosopher?" West answers in a tone that acknowledges this as a familiar question: "I think it is. I think the problem of meaning is very important. Nihilism is a serious challenge. Meaninglessness is a serious challenge. Even making sense of meaninglessness is itself a kind of discipline and achievement. The problem is, of course, you never reach it. You know, it's not a static, stationary telos, or end, or aim. It's a process. One never reaches it. It's Sisyphean. You're going up the hill, looking for better meaning, or grander, more ennobling, enabling meaning, but you never reach it. In that sense, you die without being able to have the whole, in the language of romantic discourse."

I quote all of this because it seems like these two thinkers came to very different conclusions. Both acknowledge nihilism--the sense that everything is meaningless, that there is no such thing as reason--as a known and serious problem, but they seem to have reached different conclusions: Ayn Rand was a steadfast believer in the power of reason, certain beyond a doubt that people who stayed true to their principles would achieve great things with their lives, while West seems a bit more defeated or defeatist regarding the question, acknowledging that we never really understand the fullness of meaning, that we spend our lives pursuing meaning but, in the end, we die without an apparent sense of whether anything was really important or meaningful at all. Was Shakespeare right after all? Is life really nothing but a lot of sound and fury, signifying nothing?

It's tempting to get behind Rand's thinking, to conclude that people who see life as meaningless have reached that conclusion because they believed in it from the beginning, that they set out on life's journey with the assumption that life was meaningless, and that confirmation bias gave them plenty of opportunity to gather evidence to support this claim. All they that take the sword... If your sword be nihilism and meaninglessness, then your life will be meaningless through your own fault, through your own thoughts and words and actions. It's tempting to blame these people for their own meaninglessness, to say that they brought it all upon themselves because they rejected meaning whenever it was available to them. If you reject the meaning that is in your life, then you have no one but yourself to blame if your life ends up being pointless... right?

If only it were that simple. How many times have people in the world wholeheartedly invested their time, attention, and efforts into some cause, only for the whole thing to fall apart? How many families fell apart despite the best and most sincere efforts of everyone involved to keep the family unified? How many businesses failed despite desperate efforts by all the employees to keep the business afloat? How many works of art remained unfinished or were poorly received despite the best efforts of the artists to produce the very best work they could? How many countries and nations fell apart even though their citizens believed in their beloved nation and fought for national unity with all of their hearts, minds, and bodies? How many times have people fought for what was meaningful to them, only to find that everything they believed in and everything they fought for was hollow and temporary? You can't say that people whose lives collapsed into failure and meaninglessness believed that everything was meaningless from the start; that is not true, and has never been true.

It's tempting to think that people who work together with others and promote a spirit of cooperation will get good results, but this is simply not the case. Countless times, I've seen people with decades of job experience train new people on how to do the job, only to later lose their jobs because those new people could do the job just as well and would take a lower salary. I've been in group interviews where several candidates were interviewed for a job at once, and I suppose there was supposed to be a shared spirit of brotherhood and camaraderie between us, but it was really more like a spirit of tense competition, because we knew that we were, in a very real sense, each other's opponents. That sense of competition doesn't stop even when people get hired: No matter how well a company is doing, the time will come when profits sag and people start to get let go, and everyone knows that this will happen eventually, so it's in everyone's interests to be better than your co-workers, to allow your co-workers to fail so that you can keep your job. I've seen many people who were always very cooperative and helpful at work lose their jobs because their cooperativeness allowed other people to take their work. Similarly, I've seen people who kept secrets at work secure their employment because the company knew that that person was too valuable to be let go. We'd like to believe that a spirit of cooperation makes things easier for everyone, but actually, in the real world, job security goes to the people who are the least cooperative and make sure that they alone are capable of doing their work.

Do we really want to live in that kind of a world? A world where the cruelest are rewarded for their cruelty, and the kindest are slaughtered for their weakness? A world where everyone has the binary decision to take the sword or die by it? Rand believed that intelligence would lead people out of a life in which brute force was the only thing that gave people power to live, but in reality, intelligence doesn't really mean anything; it doesn't actually get you anywhere, especially today, when computers which are more intelligent than human beings can be used by anyone. People like to imagine that humans are less murderous and brutish than they were thousands of years ago, but today, things are arguably even worse, because if you fire someone or cause them to lose their job, this is a death sentence, and instead of allowing that person to die quickly, they must die slowly and agonizingly, looking for work and money and food in a way that destroys their mind and body. There are too many people and not enough ways for people to earn a living. The world has more people than it has ever had before, and machines have automated more types of work than ever before. This process cannot continue indefinitely.

The people who most suffer from nihilism and a lack of meaning in life are very often not the people who wanted to believe that life was meaningless in the first place. Rather, they are the people who had the most life experience and who saw life most clearly for what it is.

If you look at the people who still most strongly believe in something, it's clear that they only believe because they have lost touch with reality: I do know people who still believe in a particular idea, but usually it's not because that idea has led them to any kind of success, but rather because they fanatically believe in their pet idea despite all evidence to the contrary.

The rest of us are getting tired of pushing that stone up that hill.

Two 7400-centric projects: One step forward, one step back?

UPDATE: Drass, the designer of the C74-6502, has come through with a schematic of the two main cards in the device, as well as a complete microcode listing. Good things come to those who wait. You can find the links to the relevant files at https://c74project.com/c74-6502-internals/. Thanks to Drass for stopping by here to let us know that the documentation is available!

Long-time readers of my blog may remember that some time ago, I made some fairly abortive efforts toward a project to build a 6502-compatible CPU entirely out of 7400-series logic chips. The project never really went anywhere, partly because I didn't really know what I was doing: Although I have some background in electrical and electronic engineering, the original 6502 was built in such a way that it cannot easily be reproduced using digital logic alone, because a lot of its critical timing and logic circuitry rely on analog factors like the sizes of the transistors used, things which are outside of your control if you're using pre-built logic ICs. I still feel like the idea of such a project is very important, because today, it's difficult to buy actual 6502 chips, and so being able to recreate the functionality of a 6502 using standard logic structures would go a long way toward preserving human understanding of how these devices worked and how they can be rebuilt if necessary. What finally caused me to abandon the project was the realization that even if I could somehow design and build such a device, it would be too slow: Most computers built around the 6502 had timing-critical functions which expected the CPU to run at one megahertz, and a CPU built from discrete logic chips would probably not be able to reach even that speed, because the propagation delays of the chips themselves, plus the lengths of the wires between them, would necessarily slow the signals traveling between the different parts of the CPU. The project that made me realize this was the MOnSter 6502, a great-looking project which is a working 6502 built out of discrete transistors. While it looks wonderful and actually works, it suffers from one critical problem: It has a maximum speed of about fifty kilohertz because of the propagation delays introduced by using discrete transistors instead of transistors fabricated on an IC wafer. When I saw the MOnSter 6502, I concluded that, as much great work as had gone into it, it just wasn't possible for people working at home without a semiconductor foundry to produce full-speed CPUs.
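The speed limit here is just arithmetic: the clock period can't be shorter than the worst-case (critical-path) delay through the logic. Here is that calculation as a rough Python sketch; the gate count, per-gate delays, and wiring allowance are illustrative assumptions on my part, not figures from any actual design:

```python
# Rough, illustrative timing budget for a discrete-logic CPU's critical path.
# The numbers below are assumptions for the sake of the arithmetic, not
# measurements from any real design.

GATE_DELAY_NS = {          # approximate typical propagation delay per gate
    "74LS": 10.0,          # low-power Schottky TTL
    "74AC": 5.0,           # advanced CMOS
}

CRITICAL_PATH_GATES = 8    # hypothetical: gates a signal crosses per clock cycle
WIRING_DELAY_NS = 5.0      # hypothetical allowance for board traces and cabling

def max_clock_mhz(family: str) -> float:
    """Upper bound on clock rate if the whole critical path must settle each cycle."""
    period_ns = CRITICAL_PATH_GATES * GATE_DELAY_NS[family] + WIRING_DELAY_NS
    return 1000.0 / period_ns  # period in ns -> frequency in MHz

for family in GATE_DELAY_NS:
    print(f"{family}: ~{max_clock_mhz(family):.1f} MHz")
```

The point is only that achievable clock rate falls as the critical path lengthens, which is why a CPU spread across boards and wires starts at a disadvantage against one etched onto a single die.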

Or so I thought. Then I became aware of the C74-6502 CPU. This is a CPU which is not only fully compatible with the 6502, but also with the 6510, the CPU used in the Commodore 64. And it's built entirely out of 7400-series logic chips. Someone has succeeded where I failed: Someone built a fully 6502- and 6510-compatible CPU entirely out of standard 7400-series chips, fit the entire structure onto a couple of compact circuit boards joined through pin headers, attached a ribbon cable which you can plug into any 6502 or 6510 socket so the C74-6502 works as a true drop-in replacement for a real 6502 or 6510, and somehow managed to make the whole thing capable of running at twenty megahertz! Not only can this device run any computer contemporary with the original 6502, it can run at twenty times the original speed, meaning you could theoretically make a computer which is fully compatible with the computers that used the 6502 and 6510--including the Apple I and II, Commodore 64, Atari 400, 800, and 2600, VIC-20, and the original NES--and make it twenty times faster than the original.

But who's still making computers based on the 6502? Why, Brad Graham is: He's developed the Vulcan-74, a computer built entirely out of a 6502 CPU and 7400-series logic chips; the only other ICs in the system are SRAM chips for memory. In many ways, the Vulcan-74 is the perfect complement to the C74-6502 CPU, because one is a 6502 CPU made entirely out of 7400-series chips, and the Vulcan-74 is a computer made entirely out of a 6502 and 7400-series chips, meaning if you put these two projects together, you really have the combined knowledge of how to create a fully-functional computer using nothing but standard logic structures. This would solve the problem of humans potentially forgetting how to make devices like this; as long as you know how to make AND, OR, and NAND gates, you can pretty much put these things together from scratch. (Okay, that might be a bit of an oversimplification, but it's not too far from the truth.)
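The claim about gates is not just rhetoric: NAND is functionally complete, so AND, OR, NOT, and from there any combinational circuit, can be built from NAND gates alone. A tiny Python sketch of the idea, modeling each gate as a function on 0/1 values:

```python
# Toy demonstration that NAND is functionally complete: NOT, AND, and OR
# (and hence any combinational logic) can be built from NAND alone.

def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    return nand(a, a)            # NAND with both inputs tied together

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))      # invert a NAND to get AND

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))  # De Morgan: a OR b == NOT(NOT a AND NOT b)

# Verify against the truth tables.
for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
print("NAND-built gates match the AND/OR truth tables")
```

This is exactly why 7400-series parts (the 7400 itself is a quad NAND) are, in principle, enough to rebuild a whole CPU; the hard part isn't logical completeness but timing, fan-out, and the sheer chip count.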

There's just one problem with both the C74-6502 and the Vulcan-74: Neither of them offers schematics as of this writing. Both projects' websites contain nice-looking photos of the devices in action and fairly superficial technical details, but neither includes an actual circuit diagram. Instead, the creators of both projects promise that circuit schematics will be on their way as soon as they have time to release them.

In both cases, I kind of feel like the developers are being a bit disingenuous when they claim that time constraints prevent them from releasing the schematics for both devices. The amount of time that went into designing and building both devices must have been considerable, and surely both people drew up fairly detailed diagrams before actually going through the process of physically assembling anything. Yet after going through the process of creating fairly polished-looking websites full of photographs and descriptions of what these devices can do, the developers now claim that they don't have the time to create hardware diagrams, which I am certain must already exist in a reasonably complete form, since no one builds a device like this without making design diagrams first.

This is particularly frustrating when one considers that neither of these devices is really that complex. The C74-6502 is capable of running at 20 MHz partly because instead of using the more common 74LS (low-power Schottky) chips, it's made with 74AC (advanced CMOS) chips, although the datasheet notes that 74ACT (advanced CMOS TTL-compatible) and 74HCT (high-speed CMOS TTL-compatible) chips can be used if TTL compatibility is desired. That much is a simple matter of buying different, more expensive chips to get a speed boost. Admittedly, I'm once again somewhat diminishing the amount of work that went into achieving this result, because reaching 20 MHz also required pre-fetching microcode, which is by no means simple to write, but it doesn't seem like describing the basic structure of the C74-6502 should be that difficult. Drass, the creator of the project, documented the creation of the device in a fairly lengthy forum thread on 6502.org, but forum threads are a poor way to document projects like this, because they're full of other people's posts and extraneous information which is not really important to the final build. If you go through that forum thread, you'll find a lot of information which sort of tells you how the device was put together, but proper documentation for people who want to understand how the device works would be nice. Drass has promised that such documentation will eventually be available here, but for now, the only document there is a block diagram and a fairly high-level datasheet; the real meat of the data, including circuit schematics, PCB layouts, Gerber files, component BOMs, and ROM listings, is promised for the future but currently unavailable.
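To give a feel for why pre-fetching matters (the numbers below are purely hypothetical; I have no inside knowledge of the C74-6502's actual timing): if the next microinstruction is fetched from ROM while the current one executes, the cycle time shrinks from the sum of the two delays to the larger of them.

```python
# Illustrative only -- not taken from the C74-6502 design.
# Without pre-fetch, each cycle must fetch a microinstruction and then execute
# it in sequence; with pre-fetch, the fetch overlaps the previous execute.

FETCH_NS = 25     # hypothetical microcode ROM access time
EXECUTE_NS = 30   # hypothetical datapath settle time

serial_cycle = FETCH_NS + EXECUTE_NS          # fetch, then execute
prefetch_cycle = max(FETCH_NS, EXECUTE_NS)    # fetch hidden behind execute

print(f"serial:   {serial_cycle} ns -> {1000 / serial_cycle:.1f} MHz")
print(f"prefetch: {prefetch_cycle} ns -> {1000 / prefetch_cycle:.1f} MHz")
```

The overlap is the same idea as instruction pipelining in full-scale CPUs, and it's the kind of design insight that gets lost when the only documentation is a demo video.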

Similarly, Brad Graham's website about the Vulcan-74 includes a lot of information, except the information that people actually want to see. There are pages of information about how to join breadboards together, cut and strip wire for use on breadboards, connect RAM chips to CPUs, and add I/O circuitry to a CPU, despite the fact that these are all fairly simple and standard processes which many basic tutorials have already documented. While it doesn't hurt for Graham to document these in his own words, what people would probably really like to see is the VGA circuit which allows the Vulcan-74 to output 400×300 video to a standard 15-pin VGA connector. This is the Vulcan-74's "money shot", the really impressive feat that Graham pulled off with his project, and it seems to be the one thing that isn't documented on his website. Five years ago, in a thread on hackaday.com, Graham promised "a dedicated website for this system, which will include schematics, PCB files, and a downloadable 300-400 page book on how every single part of the system works". At the time, Graham even produced a YouTube video featuring a visual look at the VGA circuit and its output, but unless I'm really missing something, hard technical information on how this circuit works is still absent now, five years later. The Vulcan-74 also has its own lengthy forum thread on 6502.org, again going through a lot of details of how the device was built as time goes by, but no complete set of documentation at the project's end.
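I don't know how Graham's circuit actually works (which is rather the point of this complaint), but one plausible observation is that 400×300 is exactly half of the standard 800×600 VESA mode in each axis, so a circuit could pixel-double into that mode's published timing. The quick arithmetic, using the standard 800×600 at 60 Hz parameters:

```python
# Standard VESA 800x600 @ 60 Hz timing (published figures). How the Vulcan-74
# actually generates video is undocumented; this only shows that a
# pixel-doubled 400x300 image fits neatly into this standard mode.

PIXEL_CLOCK_HZ = 40_000_000   # 40 MHz pixel clock
H_TOTAL = 1056                # 800 visible + 40 front porch + 128 sync + 88 back porch
V_TOTAL = 628                 # 600 visible + 1 front porch + 4 sync + 23 back porch

refresh_hz = PIXEL_CLOCK_HZ / (H_TOTAL * V_TOTAL)
print(f"refresh rate: {refresh_hz:.2f} Hz")

# Doubling each pixel in both axes fills the 800x600 visible area, so the
# framebuffer only needs 400 * 300 = 120,000 pixels.
assert 400 * 2 == 800 and 300 * 2 == 600
```

Whether the Vulcan-74 takes this route or generates its timing some other way is exactly the sort of detail a schematic or that promised book would settle.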

I really don't want to criticize these people or what they've done, because these are tremendously impressive projects, the results of literally years of diligent research and development, and furthermore, their creators are under no obligation to me or to anyone else; they built these projects in their own spare time, using their own money and effort, and were generous enough to provide glimpses of their work and its fruits for the Internet to see. All of these people are busy and have their own real lives to deal with every day; presumably they have jobs to work, families to which they wish to devote precious spare time, and other hobbies and interests they would like to indulge. All of this is perfectly understandable and reasonable, but that having been said, there's something about the attitude toward documenting these projects which bothers me at some fundamental level. The MOnSter 6502, the C74-6502, and the Vulcan-74 all have this in common: They are all great projects for which the designers made short demo videos on YouTube and a basic website to provide an overview of the project, but they never got around to releasing actual circuit diagrams or schematics, even though those diagrams likely already exist in more or less usable form.

It's difficult for me to express just how disturbing I find this. I feel like the most important part of these projects is documenting how they work so that other people can see what was done and learn from it. Seeing a brief YouTube video of a circuit board plugged into something is cool, but anybody who finds that kind of thing interesting probably wants a whole other level of technical detail than just seeing that the board can plug into a monitor and run a graphic demo. To my mind, if you ever do anything that might be interesting to other people, it's worth writing up some details about exactly how you did it and releasing them so that others can benefit from what you did. That's the open-source principle, after all; when people don't have details about how something was done, they waste time reinventing the wheel, trying to figure out what was done when they could have learned from the past if only they had the information necessary to do so.

In a larger sense, this seems to reflect a particular difference in attitudes toward documentation between people. I try to be fairly verbose when I write, because I feel like that verbosity provides clarity to others. If you skip over important details, you just muddy the waters for everyone else. Other people, however, probably find my writing style bloated and overly detailed. It's tempting to think of this as a generational difference--perhaps younger people are less interested in writing and more interested in doing cool things quickly and moving on rather than spending time on documenting what they did--but Brad Graham is slightly older than I, so I don't think this is just a matter of age; it's more a question of attitude. There is a lot that you can see and do in this world, and if you're caught up in the magic and wonder of exploring all of life's possibilities, then you probably don't have a lot of time to describe what you did after the fact because you're too busy running off to the next thing you want to do, but if you've done something that might interest other people, it's worth taking some time to describe how you did it so that other people can try to do it too, and perhaps improve on your design. If you don't, other people will try to reproduce your work in ways that might not be as good as what you achieved, because they didn't realize some key insight which you had during your project which made things turn out better than anyone would have expected, like making a CPU out of 7400-series chips that runs at 20 MHz, or a computer on breadboards that can output high-resolution VGA graphics. These are impressive feats, but if people don't understand how they were done, these achievements will be lost to the sands of time. 
It's for this reason that I can't help but feel like these projects, laudable though they are, represent one step forward, but one step back: Yes, they are impressive, but most people aren't going to be able to achieve the same things without some pointers which are currently lacking.

All of this seems to reflect a shift in attitudes toward understanding technology: I've noted before that when I was a child, it was generally considered important for people to be able to understand how computers worked. My first home computer, an Apple IIgs, came with Apple's A Touch of Applesoft BASIC, a manual describing how to program the computer using its built-in BASIC interpreter. How many Apple computers come with similar manuals today? These days, even technical workers are not expected to understand how computers work; even software developers create software by copying and pasting a handful of functions into a program and hoping that those functions do what they want, because if the plug-in functions don't work as expected, nobody really understands why, or is able to do anything without reaching for more pre-built functions. I, as a technology worker, have even been criticized at work for trying to understand how things work: I was told that trying to understand information systems takes too long and is a waste of time, and therefore, if things work as expected, I shouldn't waste my time analyzing those systems. This seems like a foolish way of thinking to me, but this is how technology companies handle their technology today: If things work, even engineers shouldn't care about how they work, because understanding doesn't bring in business.

I could theorize that all of this willful ignorance is going to get humanity into a lot of trouble someday, when we no longer understand how the systems we've built work but desperately need to understand them. I could try to encourage people to understand things more deeply, to not be content with a superficial "It looks like it's working, so why should we bother thinking about it?" attitude. I think, however, that I have made my ideas and opinions on these matters pretty apparent by now, not only in this post but also in countless others I've made in the past, and so, to avoid giving any more grief to the brilliant hackers who have produced projects far beyond what I could ever have produced myself, I will simply conclude by urging once more: Please, folks, explain what you've done so that other hackers can expand upon your work. The future of humanity will benefit from it.