MIT Researchers Explore Analog Computing with Cell-inspired Circuits

Jul 17th, 2016 9:22am by

Our idea of modern computing is closely associated with digital computers. But the binary system of 1’s and 0’s that digital computers depend upon is a poor way to accurately model the infinite continuum of complexity in the biological systems found in living cells and organisms.

Here’s where analog computers could become indispensable in the emerging field of bioinformatics, where scientists and engineers are developing methods and software tools for better understanding biological data.

In contrast to digital computers, which represent data as separate, discrete values, analog computers can represent data across a continuous range of values. That makes them much more effective in situations involving interdependent parameters that change continuously in real time, such as those in biological systems.
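To make the digital-versus-analog distinction concrete, here is a minimal sketch of the quantization a digital representation imposes on a continuously varying quantity. The signal and bit depth below are illustrative choices, not from the article:

```python
import numpy as np

# A continuously varying quantity, e.g. a decaying oscillation
# such as a fluctuating chemical concentration.
t = np.linspace(0.0, 1.0, 1000)
signal = np.exp(-3.0 * t) * np.sin(8.0 * np.pi * t)

# A digital computer must snap this onto a grid of discrete levels;
# an analog computer represents it directly as a voltage.
bits = 4
half_levels = 2 ** bits / 2          # 8 levels on each side of zero
quantized = np.round(signal * half_levels) / half_levels

# The best a 4-bit representation can do is half a step, 1/16 here.
max_error = np.max(np.abs(signal - quantized))
print(f"max quantization error with {bits} bits: {max_error:.4f}")
```

Adding bits shrinks the error but never removes it, which is the core trade-off the article's analog approach sidesteps.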

But most of today’s analog computers are manually programmed, a laborious process that may now get easier thanks to MIT CSAIL graduate student Sara Achour. Working with her advisor, MIT professor Martin Rinard, and Dartmouth professor Rahul Sarpeshkar, she created Arco, a compiler that takes high-level instructions written in a human-readable language and translates them into the low-level circuit specifications an analog computer can use.

The word “analog” may sound like a throwback to some dark, pre-digital age, and perhaps that characterization is somewhat accurate: rudimentary versions of analog computers have been around for at least a couple of millennia, ranging from the famed Antikythera mechanism to the slide rules that are still in use today. But some believe that the future of computing will be analog, as today’s ever-shrinking microchips will not provide enough computing power for tomorrow’s complex applications in scientific and medical research.

Cell-inspired Analog Circuits

The team’s paper was recently presented at the Association for Computing Machinery’s conference on Programming Language Design and Implementation. In it, they explain how Arco takes differential equations, which are often used to describe the complex dynamics of cell biology, and converts them into specifications for analog circuits whose voltages embody those equations. The compiler could work with any programmable analog device, but the team focused in particular on an analog chip Sarpeshkar is already developing, which uses circuits that model common cellular systems.
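Arco itself targets analog hardware, but the kind of system it compiles can be illustrated with a toy digital simulation. Below, a simple gene-expression model (protein produced at a constant rate and degraded in proportion to its concentration) is integrated numerically with SciPy; the model and its rate constants are illustrative assumptions, not taken from the paper:

```python
from scipy.integrate import solve_ivp

# Toy gene-expression dynamics of the sort Arco maps onto analog
# circuitry: protein p is produced at rate k and degraded at rate g*p.
# The rate constants are made-up values for illustration.
k, g = 2.0, 0.5

def protein_dynamics(t, p):
    # dp/dt = k - g * p
    return [k - g * p[0]]

sol = solve_ivp(protein_dynamics, t_span=(0.0, 20.0), y0=[0.0], max_step=0.1)

# The concentration settles toward the steady state k / g = 4.0.
print(f"p(20) ≈ {sol.y[0][-1]:.3f}")
```

A digital solver steps through such equations one clock tick at a time; the cytomorphic circuits described below solve them continuously, in the physics of the hardware itself.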

The analog approach performs markedly better than digital tools. In the team’s experiments, the compiler was tested on five sets of differential equations relevant to biological research. Compilation times ranged from under a minute for the simplest set to almost an hour for the most elaborate, a system of 75 differential equations — which compares favorably with designing an analog implementation by hand or with digital tools.

“With a few transistors, cytomorphic [“cell-inspired”] analog circuits can solve complicated differential equations — including the effects of noise — that would take millions of digital transistors and millions of digital clock cycles,” says Sarpeshkar in MIT’s press release.

Moreover, this technique could be scaled up to tackle larger problems, such as modeling a whole organ or even an organism, instead of just a cell.

“‘Digital’ is almost synonymous with ‘computer’ today, but that’s actually kind of a shame,” said Adrian Sampson, an assistant professor of computer science at Cornell University who was acknowledged for his help in the team’s paper. “Everybody knows that analog hardware can be incredibly efficient — if we could use it productively. This paper is the most promising compiler work I can remember that could let mere mortals program analog computers. The clever thing they did is to target a kind of problem where analog computing is already known to be a good match — biological simulations — and build a compiler specialized for that case.”

So what does that mean for us? There seems to be an analog renaissance afoot, and as analog computing continues to evolve in unknown directions in the years to come, we may see it underpin unexpected developments in artificial intelligence, analog computer-aided drug design and a new era in biological research, to name a few. The future is fast approaching, and it may very well be analog, not digital.

Feature Image: Wolfgang Stief (CC BY 2.0)
