Thursday, August 29, 2019

Blockchain


Anyone who has watched a technology news channel or followed a technology blog in the last few years has probably heard about blockchain. It has been touted as everything from the future of computing to the foundation of a global currency and payment system. None of us knows what the future holds for any technology, but blockchain's looks very promising.

So what is blockchain? A blockchain is a growing list of related records, called blocks, that are linked using cryptography. Each block contains a cryptographic hash of the previous block, a timestamp, and transaction data. A hash is a short, fixed-length fingerprint of the actual data: for practical purposes it uniquely identifies the data, but it cannot be reversed to reveal it. This makes it possible for a block to verify the previous block by matching its hash, without needing the data stored in that block. The timestamp anchors the transaction in time, and once a block is written it cannot be quietly modified, because any change to the data or the timestamp changes the hash and breaks the chain. You can think of a blockchain loosely as a notebook in which a carbon copy of the previous page is overlaid at the top of the current page, and a page is only a valid part of the book if it matches the page before it.
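
To make the hashing and linking concrete, here is a minimal sketch in Python (my own simplified structure for illustration, not the format any real blockchain uses) showing how each block stores the previous block's hash and how any tampering breaks the chain:

import hashlib
import json
import time

def block_hash(block):
    """Hash a block's contents (previous hash, timestamp, data) with SHA-256."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def new_block(prev_hash, data):
    return {"prev_hash": prev_hash, "timestamp": time.time(), "data": data}

# Build a three-block chain; each block records the hash of the one before it.
chain = [new_block("0" * 64, "genesis")]
for tx in ["Alice pays Bob 5", "Bob pays Carol 2"]:
    chain.append(new_block(block_hash(chain[-1]), tx))

def chain_is_valid(chain):
    """A block is valid only if its stored prev_hash matches the previous block."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

print(chain_is_valid(chain))             # True
chain[1]["data"] = "Alice pays Bob 500"  # tamper with an earlier block
print(chain_is_valid(chain))             # False -- the altered hash breaks the link

Running it prints True for the untouched chain and False after the earlier block is edited, which is exactly the tamper evidence described above.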

This high resistance to tampering makes blockchain a great tool for an open ledger that can be widely distributed across many systems. Each system in the network can independently create new blocks linked to a previous block, and every block is shared with every other system in the network. Once a block is added to the chain, it cannot be modified without altering all the subsequent blocks, and to do that, the systems in the network would have to agree to the change. This allows very large groups of people to access the ledger and track transactions without worrying that someone else in the network can quietly modify the data.

Blockchain was invented by an anonymous person or group working under the pen name Satoshi Nakamoto in 2008 to serve as the transaction ledger for bitcoin. This allowed bitcoin to become the first electronic currency to solve the double-spending problem without a trusted authority or central server. Work on a cryptographically secured chain of blocks was first described in 1991 by Stuart Haber and W. Scott Stornetta, who wanted to prevent document timestamps from being modified after creation. In 1992, Bayer, Haber, and Stornetta incorporated Merkle trees into the design, improving efficiency by allowing several document certificates to be collected into a single block. Nakamoto improved the design by using a Hashcash-like proof-of-work scheme to add blocks without requiring them to be signed by a trusted party. This modification allowed blocks to be added from any source and still be trusted, because once added they could not be altered.

The words block and chain were used separately in Nakamoto's original paper and were only popularized as the single term blockchain around 2016. What makes blockchain such a powerful tool is that it creates a digital ledger that does not require a central server, can be made entirely public, and is shared across many computers. No record can be altered retroactively without altering all the blocks that follow it, so any participant in the chain can verify and audit transactions independently and inexpensively. The mass collaboration between peers in the system creates a robust workflow by its very nature. Because a blockchain cannot be independently reproduced, each unit of value can only be transferred once, solving the long-standing double-spending problem of digital currencies.

Blockchain is another technology, like cryptography, that will be affected by quantum computers, but not all of the impact is negative. Quantum computing may speed up parts of blockchain processing, though the digital signatures that secure blockchain transactions will, like the rest of cryptography, need to move to quantum-resistant schemes.
I realize that there were many technical terms used in this article that may be unfamiliar to you, so in the coming weeks I will cover hashes and Merkle trees in more detail. For now, it is enough to state that they are methods of linking, storing and verifying data heavily used in blockchain technologies.

Thursday, August 22, 2019

Quantum Computing and what it means for you


Do you use online banking? Buy things on Amazon? Or post stories on Facebook? If so, then quantum computing could have a major impact on how you do things on the internet. To explain how, we must start by understanding internet security.

Have you ever noticed that some websites, like your banking site, start with https and show a little lock next to the address bar, while others start with http and have no lock? The https and the lock mean the connection is secure: any information you enter on a form can only be read by the legitimate operators of the site. This works through encryption, historically provided by the Secure Sockets Layer (SSL) and today by its successor, Transport Layer Security (TLS).

Current public-key encryption rests on the premise that factoring very large numbers, specifically the product of two large primes, is an extremely hard problem for a computer to solve. In the widely used RSA scheme, two primes of roughly 1024 bits each are multiplied together to form a 2048-bit public modulus. Anyone can see the modulus, but only the key's owner knows the two primes, and an eavesdropper listening on the line would have to recover them by factoring.
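
As a rough, hedged illustration of why the secrecy of the primes matters, here is a toy RSA example in Python with deliberately tiny primes (real keys use primes of roughly 1024 bits; the values and variable names here are mine, chosen only for readability):

from math import gcd

# Toy RSA with tiny primes -- real deployments use ~1024-bit primes,
# making the modulus n effectively impossible to factor classically.
p, q = 61, 53                # the two secret primes
n = p * q                    # public modulus (3233) -- safe to publish
phi = (p - 1) * (q - 1)      # only computable if you know p and q
e = 17                       # public exponent, must be coprime with phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)          # private exponent (Python 3.8+ modular inverse)

message = 65
ciphertext = pow(message, e, n)    # anyone can encrypt with (e, n)
recovered = pow(ciphertext, d, n)  # only the key holder can decrypt
assert recovered == message

# An eavesdropper who could factor n back into p and q could recompute
# d exactly as above -- which is why fast factoring (for example, on a
# quantum computer) would break this scheme.
print(n, ciphertext, recovered)

The point of the sketch is the last comment: anyone who can factor n back into p and q can rebuild the private exponent d, which is why fast factoring would break the scheme.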

Factoring such a large number is, for all practical purposes, impossible today: with the best known classical algorithms, even the most powerful supercomputers would need far longer than the current age of the universe to recover the primes, and long before anyone finished, your banking session would be ancient history.

This is where quantum computers come into play. A quantum computer works off the concept of a qubit. A qubit can be thought of as a coin flipping in the air: we do not know whether it is heads or tails until we catch it and look, and before it lands it has some probability of being either one. This state is known in the quantum computing world as superposition. Another property of qubits is called entanglement. Entanglement links two qubits so that their measurement results are correlated; depending on how they became entangled, they are always found in either exactly the same state or exactly opposite states.
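
The coin-flip picture can be mimicked with a tiny state-vector simulation in Python; this is a back-of-the-envelope sketch of the math, not how real quantum hardware is programmed, and the amplitudes are kept real for simplicity:

import random

# A qubit in equal superposition: amplitudes for |0> and |1>.
# The probability of each outcome is the amplitude squared (0.5 each).
qubit = [2 ** -0.5, 2 ** -0.5]

def measure(amplitudes):
    """Collapse to 0 or 1 with probability equal to the squared amplitude."""
    p0 = amplitudes[0] ** 2
    return 0 if random.random() < p0 else 1

print([measure(qubit) for _ in range(10)])  # a random mix of 0s and 1s

# An entangled pair (a Bell state): amplitudes for |00>, |01>, |10>, |11>.
# Only |00> and |11> are possible, so the two measurements always agree.
bell = [2 ** -0.5, 0.0, 0.0, 2 ** -0.5]

def measure_pair(amplitudes):
    probs = [a ** 2 for a in amplitudes]
    return random.choices([(0, 0), (0, 1), (1, 0), (1, 1)], weights=probs)[0]

print([measure_pair(bell) for _ in range(5)])  # always (0, 0) or (1, 1)

The single qubit comes up 0 or 1 at random, while the entangled pair, however many times you run it, always lands on matching results.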

These two properties allow a quantum computer to work with a huge number of possible inputs at once and, with the right algorithm, to amplify the correct answer. As a result, a quantum computer could factor the very large numbers behind modern encryption very quickly, in effect breaking modern cryptography. That is bad news for your bank account, but the good news is that doing so requires a large, fault-tolerant quantum computer.

Today we have noisy intermediate-scale quantum (NISQ) computers. NISQ systems have relatively small qubit counts, on the order of 50 qubits, and those qubits are noisy and error prone, so they cannot reliably run the long computations that breaking encryption requires. So far they have only factored tiny numbers, nowhere near the 2048-bit keys used in practice.

Current predictions put us 10 to 20 years away from a fault-tolerant quantum computer with enough stable, error-corrected qubits to break 2048-bit encryption, and the banking industry is already working on quantum-resistant encryption techniques in preparation for that future. So there is no real need to worry for at least a decade.

Thursday, August 15, 2019

Next generation computing


Over the last few weeks we have discussed several computing laws that have driven computing development over the past decades. We noticed that many of them have slowed down and some of them are dead. This means that our current approach of building faster, more efficient computers by classical methods is coming to an end. So what's next?

There are several new technologies being developed to overcome the speed barriers we have hit with classical computing. One of those technologies is Quantum Computing, which will be the focus of this week’s article.

Quantum Computing is not as new as people may believe. The idea goes back to the early 1980s, when Yuri Manin and Richard Feynman independently suggested that a quantum computer could simulate things that a classical computer cannot. A great example is the simulation of molecular structures: a molecule's complexity grows exponentially with the number of electrons it contains, making it nearly impossible to simulate a large molecule exactly even on today's largest computing platforms.

So how is this possible? To understand the power of a quantum computer, first we need to understand a little about how classical computers work. So here it goes. A classical computer stores information in bits. You can think of a bit as a tiny on/off switch; it is always either on or off and cannot get stuck somewhere in between. Each additional bit doubles the number of values a computer can represent. Modern computers have a 64-bit architecture, which lets them store and process single chunks of information as large as 64 binary digits. That is a number large enough to count seconds for roughly 292 billion years. It seems like a lot, but we are reaching the computational limits of these systems.
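
A quick sanity check of that 292-billion-year figure (treating the counter as a signed 64-bit value with a maximum of 2^63 - 1 seconds, which is my assumption here) takes only a few lines of Python:

SECONDS_PER_YEAR = 60 * 60 * 24 * 365.25
max_signed_64 = 2 ** 63 - 1          # largest signed 64-bit integer
years = max_signed_64 / SECONDS_PER_YEAR
print(f"{years:.3e} years")          # roughly 2.9e11, i.e. ~292 billion years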

A quantum computer works in a very different way. It uses the strange properties of quantum physics to store and manipulate data. Two properties of quantum bits, or qubits, make them extremely powerful for certain computing problems. The first is superposition, which means a qubit can represent a one, a zero, or any blend of the two; what it really holds is a probability of being measured as a one. The second is entanglement, which links two qubits so that their states are correlated: measuring or manipulating one immediately tells you something about the other.

So what exactly does this mean? A classical computer can only hold one definite state at a time: although it can represent and compute numbers as large as 64 bits, a register holds only one number at once. Superposition allows a quantum register to hold all of its possible states simultaneously. As an example, the first home computers were 8-bit systems; an 8-bit register could represent 256 possible values but held only one of them at a time. An 8-qubit quantum computer holds a superposition over all 256 values and can apply a single operation to all of them at once, so in that sense it works on 256 values in the time an 8-bit register works on one. It is important to remember, though, that a quantum computer is not really storing all the answers for you to read out; it stores the probabilities of each state, so a computation can return a wrong answer, and results have to be checked or repeated.
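
To put the exponential growth in perspective, simulating an n-qubit register classically means tracking 2^n amplitudes. The short sketch below assumes 16 bytes per amplitude (one complex double, a common simulator convention rather than a hard rule) to show how quickly that overwhelms ordinary memory:

# Memory needed to hold a full n-qubit state vector classically,
# assuming 16 bytes (one complex double) per amplitude.
for n in (8, 16, 32, 50):
    amplitudes = 2 ** n
    gigabytes = amplitudes * 16 / 1e9
    print(f"{n:>2} qubits -> {amplitudes:,} amplitudes (~{gigabytes:.3g} GB)")

Eight qubits need only a few kilobytes, but by around 50 qubits the state vector no longer fits in any existing machine's memory, which is the point at which quantum hardware starts doing something classical simulation cannot.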

The most powerful algorithm proven to work on quantum computers, provided we overcome the stability issues, is Shor's algorithm, which factors very large numbers. It would allow someone with access to a large enough quantum computer to break modern public-key cryptography, making current network security impossible to maintain. Post-quantum cryptography is becoming a major field of study, aimed at making encryption quantum-proof before the processors become powerful enough to break it. Next week we will talk a little about cryptography and what it means to you.
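
Shor's algorithm needs quantum hardware only for its period-finding step; the classical wrapper around that step is simple enough to sketch. In the hedged example below, the brute-force find_period function stands in for the quantum subroutine and is only practical for toy numbers:

from math import gcd
import random

def find_period(a, n):
    """Brute-force the period r of a^x mod n -- the step Shor's algorithm
    replaces with a fast quantum subroutine."""
    x, value = 1, a % n
    while value != 1:
        x += 1
        value = (value * a) % n
    return x

def shor_classical_part(n):
    """Classical reduction from period finding to factoring."""
    while True:
        a = random.randrange(2, n)
        d = gcd(a, n)
        if d > 1:
            return d, n // d          # lucky guess already shares a factor
        r = find_period(a, n)
        if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
            p = gcd(pow(a, r // 2) - 1, n)
            if 1 < p < n:
                return p, n // p

print(shor_classical_part(15))   # (3, 5) in some order
print(shor_classical_part(21))   # (3, 7) in some order

On a real quantum computer the period would come from the quantum Fourier transform in polynomial time, which is what turns this textbook reduction into a practical attack on large keys.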


Thursday, August 8, 2019

The Circle of Life


Three weeks ago, I began a series on computing laws and promised I would propose a law of my own in the final article of the series. The time has come to introduce Hamilton's Law, “The Circle of Life,” which captures my observation of the cyclical nature of computing platforms and marks what I believe to be the first of many cycles in computer development.
The first computers known in history were probably better classified as memory aids than as calculating devices. The earliest known was the abacus, used primarily for counting and tracking large numbers without having to record them on paper; it could store a single number at a time. The next early computer was the slide rule, which packed the effect of large tables of hard-to-calculate values, such as sines, cosines, and square roots, into a small form factor. The calculations were still primarily done by people.
In the 1930s, computers were people hired to do computation on paper; usually several “computers” worked the same complex problem, and the results were compared to check for accuracy. In the 1940s we began to build very large vacuum-tube computers the size of houses that could only handle a few bits of information (around 30 bits), enough to work with a single 10-digit number at a time.
The 1950s brought the first large-scale universal computer, universal meaning it was not limited to simple arithmetic and logic but could tackle much more complex problems. The Univac was a vacuum-tube computer roughly 1,000 times more powerful than the code-breaking computers of the 1940s and could calculate with and store up to 1,000 12-digit numbers.
The 1960s brought the first transistor-based computers, which shrank machines from the size of a building to the size of a refrigerator. We were still far from a portable device, or even a home computer, but we were getting closer. These transistor-based computers were smaller and, once again, about 1,000 times more powerful than those of the prior decade.
The 1970s brought a lot of exciting things for the general public. Before then, only government organizations and large academic institutions had access to computers. Around 1970 the ARPANET, the forerunner of the internet, came online, allowing these institutional computers to communicate with each other. Methods of putting multiple transistors into a single device, called an integrated circuit (IC), went into production, and computers got even smaller. By the mid-1970s you could buy an IC-based desk calculator that could do everything the computers of the 1930s could do and then some. These desk calculators could be carried in one hand but used too much power to be portable and needed to be plugged in to operate.
The 1980s brought the first true computers into the home; it was the era of the home computer. These machines exceeded the computing power of the Univac and were small enough to sit on or under a desk. It was not until the 1990s that computers got both light enough and efficient enough to become portable. They were still the size of a briefcase but affordable enough that most middle-class families could buy one if they were interested. The World Wide Web was also born in the 1990s, allowing people to share information openly from their computers.
The 2000s were the beginning of the portable computing era. Computers were finally small enough and efficient enough to be carried in one hand, or even a pocket. They were battery powered and could last a few hours without a charge. Wireless networks were coming about, allowing us to utilize the web without being connected to a wire. Yet there was more to come.
The 2010s have been the era of ultraportable computers: the iPhone, tablets, the Apple Watch, Fitbits, and other wearable computers. I remember in the early '80s talking about how some day we would have computers everywhere, but I never imagined computers fitting in a watch.
Looking to the future is where the cycle begins again. Over the last few years the technology to build quantum computers has taken hold, and the 2020s will be the decade of the quantum computer. A leading quantum computer today has around 30 qubits, echoing the roughly 30-bit systems of the 1940s; it also weighs about the same and takes up about the same amount of space. We can hope that the quantum computing cycle moves faster than the digital cycle described above, or we will be waiting until the year 2100 for portable quantum computers. We have traveled full circle back to room-sized, early-stage technology; granted, these quantum systems promise to be vastly more capable than today's digital systems, just as today's digital systems are vastly more capable than the early analog ones, but we are only beginning to learn how to build them. Look forward to next week, when I talk about quantum computing and what it means for you.

Thursday, August 1, 2019

Computing Laws Part 2


Last week we touched on a few of the laws governing algorithm performance on computers. These laws talked a lot about the nature of computers, how they work and communicate with each other, and the impact that this interaction has on the performance of software. This week we are going to talk about a few of the laws that govern what is possible in the development of computer hardware.
The most well-known of these laws is Moore's Law, which has recently reached the end of its usefulness. In 1965, Gordon Moore, who went on to co-found Intel, observed that the density of transistors on an integrated circuit doubled about every two years. There are several arguments that its usefulness as a guide to computational power ended around 2001, though Intel, as a leader in the processor industry, argues that it is still in effect and still improving computing hardware today. For a couple of reasons, I am in the group that believes Moore's Law effectively died around 2001. The first is that single-core clock speeds stopped their rapid climb in the early 2000s and have been roughly flat ever since. The second is that the speed benefit of smaller transistors largely went away; shrinking transistors no longer automatically made them faster, so Intel and other processor manufacturers started putting more cores on a single chip instead. We now have processors with 40 or more cores on a chip, but individual cores are not dramatically faster than the best single-core chips of that era. It is also widely agreed that Moore's Law cannot continue much longer, because transistors are approaching atomic dimensions and cannot shrink much further, so Moore's Law is, in effect, dead, but still worthy of mention.
Koomey's Law is very similar to Moore's Law but relates to the energy consumption of a processor. Jonathan Koomey observed that as transistors got smaller they also got more energy efficient, with efficiency doubling on average about every 1.5 years; as Moore's Law slowed, the pace fell off, and efficiency now doubles only about every 2.6 years.
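The compounding in these doubling laws is easy to underestimate. A short calculation, using nothing but the doubling periods quoted above, shows how much difference the period makes over a single decade:

def growth_over(years, doubling_period_years):
    """How many times a quantity multiplies over a span: 2^(years / period)."""
    return 2 ** (years / doubling_period_years)

decade = 10
print(f"Moore's Law (2.0-year doubling): {growth_over(decade, 2.0):6.1f}x")
print(f"Koomey, old (1.5-year doubling): {growth_over(decade, 1.5):6.1f}x")
print(f"Koomey, now (2.6-year doubling): {growth_over(decade, 2.6):6.1f}x")

Roughly 32x for a 2-year doubling and over 100x for 1.5 years, but only about 14x at the current 2.6-year pace.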
Dennard's Law, usually referred to as Dennard scaling, was an observation made in 1974 by Robert H. Dennard and his colleagues. It explained how processor manufacturers were able to increase the clock frequency, and thus the speed of the processor, without significantly increasing power consumption. Dennard found that, roughly, "as transistors get smaller, their power density stays constant, so that the power use stays in proportion with area; both voltage and current scale downward with the length of the transistor." Dennard scaling broke down around 2005-2006 because it ignored factors that had become significant, notably leakage current, the current lost across the transistor's gate, and the threshold voltage, the minimum voltage needed to switch the transistor on. Together these establish a floor on power consumption per transistor, regardless of its size.
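Dennard's observation can be restated as a small calculation. The sketch below assumes the classic scaling factor of roughly 0.7 per generation and the textbook dynamic-power relation P = C x V^2 x f, and it deliberately ignores the leakage effects that eventually broke the rule:

# Classic Dennard scaling: shrink linear dimensions by k (~0.7 per generation).
k = 0.7

area = k ** 2            # transistor area shrinks by k^2
capacitance = k          # gate capacitance scales with size
voltage = k              # supply voltage scales down with size
frequency = 1 / k        # smaller transistors switch faster

power_per_transistor = capacitance * voltage ** 2 * frequency   # ~k^2
power_density = power_per_transistor / area                     # ~1.0

print(f"power per transistor: {power_per_transistor:.2f} (vs. 1.00 before)")
print(f"power density:        {power_density:.2f} (unchanged, as Dennard observed)")

Power per transistor drops by about half, area drops by the same factor, so power per unit area stays constant, exactly the behavior Dennard described, until leakage and threshold-voltage limits stopped voltage from scaling any further.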
Rock's Law is an economic companion to these laws, related to the cost of processor manufacturing, and probably one of the main reasons Moore's Law scaled over years rather than months. Arthur Rock, an early investor in many processor companies, observed that the cost of a semiconductor fabrication plant doubles every four years. Rock's Law is also known as Moore's second law; it is the economic flip side of Moore's Law.
The last law I will cover on computer processor design is Bell's Law. In 1972, Gordon Bell observed that, over time, low-cost, general-purpose computing architectures evolve, become mainstream, and eventually die out. You can see this with your cell phone; most people replace theirs every two years. His law, however, was describing larger-scale classes of systems: roughly every decade a new class of computers creates new uses and establishes a new industry - the 1960s mainframes, 1970s minicomputers, 1980s personal computers, 1990s worldwide web, 2000s cloud computing, and 2010s handheld devices and wireless networks. Predictions indicate that the 2020s will be the decade of quantum computing.

Death of an Interconnect

I was interviewed yesterday about my reaction to the news that Intel is discontinuing development of the next generation of OmniPath. I was surprised to hear that they canceled the project, but like many others over the years, they lost to market leader Mellanox. Every company I have seen attempt to compete in the high-speed interconnect market has folded within five years, so in hindsight I should not have been surprised. To read the full article, including my comments, you can check it out here.
Intel Kills 2nd-Gen Omni-Path Interconnect For HPC, AI Workloads