mikewarot 2 hours ago

Transistors are generally at their lowest static power dissipation when they are either fully on or fully off. The analog middle is great if you're trying to process continuous values, but then you're forced to use a bias current to hold the transistor in that middle region, which is fine if that's the nature of the circuit.

A chip with billions of transistors can't reasonably work if most of them sit in that analog mode; it'll just melt to slag unless you have an amazing cooling system.

Also consider that there is only one threshold between values in a binary system. With a trinary system you would likely have to double the power supply voltage, and thus quadruple the power required, just to maintain the same noise margins.
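
Back-of-envelope, assuming the usual CMOS switching-power model:

  P_dyn ≈ α · C · V² · f

so doubling the supply voltage by itself roughly quadruples the switching power, before you even account for the extra threshold.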

  • throw10920 39 minutes ago

    This is a great point, and I'll extend it by claiming that there's a more general physical principle underneath: it's significantly easier to build bistable systems than tristable (or higher) systems, so much so that it makes up for the fact that you need more of them.

    This is far more general than electronic systems (e.g. quantum computers follow the same principle - it's far easier to build and control qubits than qutrits/qudits).

    (technically, it's even easier to build systems that have a single stable configuration, but you can't really store information in those, so they're not relevant)

  • foxglacier 42 minutes ago

    Wouldn't you also get data loss using the linear region of transistors? The output would have some error relative to the input, and that error would propagate through the circuit, perhaps eventually saturating at on or off, where it would get stuck.

bastawhiz an hour ago

Trinary is an efficient way of storing lots of -1/0/1 machine learning model weights. But as soon as you load it into memory, you need RAM that can store the same thing (or you're effectively losing the benefit: storage is cheap). So now you need trinary RAM, which, as it turns out, isn't great for normal general-purpose computation. Integers, floats, and boolean values don't get stored efficiently in trinary unless you toss out power-of-two sizes. CPU circuitry becomes more complicated to add/subtract/multiply those values. Bitwise operators in trinary become essentially impossible for the engineer of average IQ to reason about. We'd need all-new ISAs, assembly languages, compilers, and languages that can run efficiently without the operations that trinary machines can't perform well, etc.

So do we have special memory and CPU instructions for trinary data that lives in a special trinary address space, separate from traditional data that lives in a binary address space? No, the juice isn't worth the squeeze. There's no compelling evidence this would make anything better overall: faster, smaller, or more energy efficient. Every improvement that trinary potentially offers means throwing the baby out with the bathwater somewhere else. It's fun to think about, I guess, but I'd bet real money that in 50 years we're still having the same conversation about trinary.
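
For what it's worth, the storage half is easy on ordinary binary hardware. Here's a rough sketch in plain Python (helper names made up) of packing five -1/0/1 weights per byte, since 3^5 = 243 fits in 256, i.e. about 1.6 bits per weight:

  # Rough sketch: pack ternary weights (-1/0/1) five to a byte, 3**5 = 243 <= 256.
  def pack_trits(weights):
      assert len(weights) % 5 == 0
      packed = bytearray()
      for i in range(0, len(weights), 5):
          value = 0
          for w in reversed(weights[i:i + 5]):
              value = value * 3 + (w + 1)   # map -1/0/1 to 0/1/2
          packed.append(value)
      return bytes(packed)

  def unpack_trits(packed, count):
      weights = []
      for byte in packed:
          for _ in range(5):
              weights.append(byte % 3 - 1)  # map 0/1/2 back to -1/0/1
              byte //= 3
      return weights[:count]

  w = [-1, 0, 1, 1, 0, -1, -1, 0, 1, 0]
  assert unpack_trits(pack_trits(w), len(w)) == w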

alphazard 32 minutes ago

I've never understood the fascination here. Apparently some expression relating the number of possible symbols and the length of a message is optimized at Euler's number. I don't see why the product of those things is worth optimizing for. The alphabet size that works best is dictated by the storage technology; more symbols usually means it's harder to disambiguate them.

Two is the smallest number of symbols needed to encode information, and it makes the symbols the easiest to disambiguate in any implementation; good enough for me.
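
(The expression in question is "radix economy": writing N takes about log_b N digits in base b, so symbols times length is roughly b · log_b N = (b / ln b) · ln N, and b / ln b is minimized at b = e ≈ 2.718. Among integers, 3 edges out 2 by about 5%.)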

  • kingstnap 4 minutes ago

    The idea is that, in some situations, the effort needed to build a system is roughly proportional to the number of symbols times the number of positions needed.

    Here's a concrete example: imagine you're building a printing press and need to cast movable type that can represent every number up to 100 million.

    Counting one piece per symbol per position (for the values 0 through 99,999,999): in binary you need 27 × 2 = 54 pieces, in ternary 17 × 3 = 51, in octal 9 × 8 = 72, in decimal 8 × 10 = 80, and in hexadecimal 7 × 16 = 112. Ternary comes out cheapest.
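
    A quick way to check those counts (plain Python; assumes one piece per symbol per position, for the values 0 through 99,999,999):

      # pieces of type = symbols in the base * positions needed
      def positions_needed(n_values, base):
          d = 1
          while base ** d < n_values:
              d += 1
          return d

      N = 100_000_000   # values 0 .. 99,999,999
      for base in (2, 3, 8, 10, 16):
          d = positions_needed(N, base)
          print(base, d, base * d)   # -> 54, 51, 72, 80, 112 pieces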

bee_rider 3 hours ago

> Trinary didn’t make any headway in the 20th century; binary’s direct mapping to the “on”/”off” states of electric current was just too effective, or seductive; but remember that electric current isn’t actually “on” or “off”. It has taken a ton of engineering to “simulate” those abstract states in real, physical circuits, especially as they have gotten smaller and smaller.

But, I think things are actually trending the other way, right? You just slam the voltage to “on” or “off” nowadays—as things get smaller, voltages get lower, and clock times get faster, it gets harder to resolve the tiny voltage differences.

Maybe you can slam to -1. OTOH, just using 2 bits instead of one... trit(?) seems easier.

Same reason the “close window” button is in the corner. Hitting a particular spot requires precision in 1 or 2 dimensions. Smacking into the boundary is easy.

  • hinkley 2 hours ago

    The lower voltage helps reduce leakage and capacitance in the chip as the wires get closer together.

    But it does argue against more states, given the benefits of just making each element smaller if you can and packing things closer together. Though maybe we're hitting bottom now that Dennard scaling is dead. Maybe we increase the pitch and double the number of states on parts of the chip, and then generations are measured in bits per angstrom.

  • estimator7292 3 hours ago

    Once we invented CMOS this problem pretty much went away. You can indeed just slam the transistor open and closed.

    Well, until we scaled transistors down to the point where electrons quantum tunnel across the junction. Now they're leaky again.

russdill 2 hours ago

There are a ton of places in modern silicon where a voltage represents far more than just on or off, from the 16 levels of QLC flash to the various PAM schemes used by modern interconnects.
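
For a concrete sense of the PAM side, a toy sketch in Python (one common Gray-coded mapping, not any particular PHY): each PAM4 symbol sits at one of four levels and carries two bits.

  # Toy PAM4: 2 bits -> 1 of 4 signal levels, so each symbol carries 2 bits.
  PAM4_LEVELS = {0b00: 0.0, 0b01: 1 / 3, 0b11: 2 / 3, 0b10: 1.0}  # Gray-coded

  def pam4_encode(bits):
      assert len(bits) % 2 == 0
      return [PAM4_LEVELS[(bits[i] << 1) | bits[i + 1]]
              for i in range(0, len(bits), 2)]

  print(pam4_encode([1, 0, 0, 1, 1, 1]))  # 6 bits -> 3 symbols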

  • hinkley 2 hours ago

    I’ve wondered any number of times if 4 level gates would be useful to increase cache memory in CPUs. They aren’t great for logic, but how much decoding would they need to expand an L3 cache?

  • DiggyJohnson 2 hours ago

    What is PAM in this context?

    • saxonww 2 hours ago

      Pulse amplitude modulation

      • DiggyJohnson an hour ago

        Thanks. That's a deep rabbit hole at first glance, to say the least.

gyomu 2 hours ago

> Trinary is philosophically appealing because its ground-floor vocabulary isn’t “yes” and “no”, but rather: “yes”, “no”, and “maybe”. It’s probably a bit much to imagine that this architectural difference could cascade up through the layers of abstraction and tend to produce software with subtler, richer values … yet I do imagine it.

You can just have a struct { case yes; case no; case maybe; } data structure and pepper it throughout your code wherever you think it’d lead to subtler, richer software… sure, it’s not “at the hardware level” (whatever that means given today’s hardware abstractions) but that should let you demonstrate whatever proof of utility you want to demonstrate.
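
Something like this in Python, say, with Kleene-style min/max connectives (just a sketch of the idea; names invented):

  from enum import Enum

  class Tri(Enum):
      NO = 0
      MAYBE = 1
      YES = 2

  # Kleene-style connectives: a conjunction is only as certain as its weakest input.
  def tri_and(a, b):
      return Tri(min(a.value, b.value))

  def tri_or(a, b):
      return Tri(max(a.value, b.value))

  print(tri_and(Tri.YES, Tri.MAYBE))  # Tri.MAYBE
  print(tri_or(Tri.NO, Tri.MAYBE))    # Tri.MAYBE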

Nevermark an hour ago

Ternary is indeed an enticing, yet ultimately flawed dream.

Quaternary allows for:

  True, “Yes”

  False, “No”

  Undetermined, “Maybe”, True or False
And:

  Contradiction, “Invalid”, True and False
Many people don’t know this, but all modern computers are quaternary, with 4 quaternit bytes. We don’t just let anyone in on that. Too much power, too much footgun jeopardy, for the unwashed masses and Python “programmers”.

And the tricky thicket of web standards can’t be upgraded without introducing mayhem. But Apple’s internal-only docs reveal macOS and Swift have been fully quaternary compliant on their ARM since the M1.

On other systems you can replicate this functionality, at your own risk and effort, with two bits per. Until safe Rust support ships.

—-

It will revolutionize computing, from the foundations up, when widely supported.

Russell's paradox in math is resolved. Given the set S = "the set of all sets that don't contain themselves", the truth value of "Is S in S?" in quaternary logic returns Contradiction, i.e. True and False, making S a well-formed, consistent entity and achieving full set-theoretic and logical completeness with total closure. So consistency is returned to set theory, and Russell's quest for a unification of mathematics with just sets and logic becomes possible again. He would have been ecstatic. Gödel be damned!

Turing’s Incompleteness Theorem demonstrates that 2-valued bit machines are inherently inconsistent or incomplete.

Given a machine M, applied to the statement S = “M will say this statement is False”, or “M(S) = False”, it has to fail.

If M(S) returns True, we can see that S is actually False. If M(S) returns False, we can see that actually S is True.

But for a quaternary Machine M4 evaluating S4 = “M4(S4) = False”, M4(S4) returns Contradiction. True and False. Which indeed we can see S4 is. If it is either True or False, we know it is the other as well.

Due to the equivalence of Undecidability and the Turing Halting Problem, resolving one resolves the other. And so quaternary machines are profoundly more powerful and well characterized than binary machines. Far better suited for the hardest and deepest problems in computing.

It’s easy to see why the developers of Rust and Haskell are so adamant about getting this right.

  • nzeid an hour ago

    Not wrong, but I think the hope was more to have "8-trinit" bytes i.e. something with more states than a classic bit.

  • IndrekR 40 minutes ago

    The most common quaternary storage system is probably DNA.

  • readthenotes1 an hour ago

    I've liked true, false, unknown, unknowable--though I think there should be something somewhere for fnord.

pumplekin 3 hours ago

I've always thought we could put a bit of general purpose TCAM into general purpose computers instead of just routers and switches, and see what people can do with it.

    I know (T)CAMs are used in CPUs, but I'm thinking more of the kind of research being done with TCAMs in SSD-like products, so maybe we will get there some day.

  • hinkley 2 hours ago

    There’s a lot of tech in signaling that doesn’t end up on CPUs and I’ve often wondered why.

    Some of it is ending up in power circuitry.

  • cyberax an hour ago

    TCAM still uses 2-bit binary storage internally; it just ignores one of the values.
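
    Roughly the usual value/mask encoding, sketched in Python (toy code, nothing vendor-specific): each ternary cell is two stored bits, and a cleared mask bit means "don't care", so one of the four bit combinations goes unused.

      # Each entry is (value, mask); a 0 mask bit is a wildcard at that position.
      def tcam_lookup(key, entries):
          for value, mask in entries:
              if key & mask == value & mask:
                  return (value, mask)     # first match wins, as in a real TCAM
          return None

      entries = [(0b1000, 0b1100)]         # matches any 4-bit key of the form 10xx
      print(tcam_lookup(0b1011, entries))  # hit
      print(tcam_lookup(0b0011, entries))  # None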

1970-01-01 42 minutes ago

Isn't quantum computing "all the -aries"?

The quantum dream is also the trinary dream.

jacobmarble 3 hours ago

In digital circuits there’s “high”, “low”, and “high impedance”.

  • gblargg 2 hours ago

    There's low-impedance and high-impedance. Within low-impedance, there's high and low.

ChrisMarshallNY 2 hours ago

I seem to remember reading about "fuzzy logic" (a now-quaint term), where a trinary state was useful.

  • zer00eyz an hour ago

    "One feature that sets certain rice cookers above the rest is “fuzzy logic,” or the ability of an onboard computer to detect how quickly the rice is cooking or to what level doneness it has reached, then make real time adjustments to time and temperature accordingly. " ... From: https://www.bonappetit.com/story/zojirushi-rice-cooker

    It is a term that is still used quite a bit for marketing. I think in this case (Zojirushi) it isn't trinary, but rather some probabilistic/Bayesian system that derives a boolean from a number of factors (time, temp, and so on).
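
    If there is real fuzzy logic under the hood, the flavour is roughly this (toy Python, numbers invented, nothing to do with the actual cooker): inputs get graded degrees of truth in [0, 1] instead of hard yes/no, and rules combine them with min/max.

      def too_hot(temp_c):          # degree to which "temperature is high"
          return min(1.0, max(0.0, (temp_c - 95) / 10))

      def boiling_fast(rate):       # degree to which "water is boiling off fast"
          return min(1.0, max(0.0, rate / 2.0))

      # Rule: IF too hot AND boiling fast THEN cut power (to that degree).
      def cut_power(temp_c, rate):
          return min(too_hot(temp_c), boiling_fast(rate))

      print(cut_power(101, 1.5))    # 0.6: partially true, so partially cut power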

anon291 2 hours ago

Mapping the three trinary values to yes, no, and maybe is semantic rubbish.

DiggyJohnson 2 hours ago

This is off topic but how do you build and post to that blog? Homegrown or framework?

marshfram 3 hours ago

Analog is next. Software first, then build the machines. No more models, reductions, loss. Direct perception through measurement and differences.

  • bastawhiz an hour ago

    We'd need a real breakthrough in physics to have such a technology that works at a scale even remotely comparable to what a low end digital CPU can do today. The thing is, there's not even any real evidence (at least to my knowledge) that there are useful threads that researchers know to pull on that could yield such a technology. Emulating analog hardware with digital hardware in anticipation of some kind of breakthrough isn't going to have any material benefits in the short to medium term.

  • cluckindan 3 hours ago

    Analog was before, though. General computing was never realized using those architectures; granted, they were mechanical in nature, so that is a big ask, both figuratively and literally.

    Maybe we could create continuous-valued electrical computers, but at least state, stability and error detection are going to be giant hurdles. Also, programming GUIs from Gaussian splats sounds like fun in the negative sense.

    • marshfram 3 hours ago

      You have to withdraw from the binary in all senses to begin to imagine what an analog spatial differences measurement could function as.

      Again, think software first. The brain is always a byproduct of the processes, though it is discerned as a materialist operation.

      Think big: binary computers are toys in the grand scheme of things.