A problem that takes the age of the universe to calculate and can be done on your home computer

Created on June 28, 2011, 10:29 p.m.

In the theory of computation we are taught about a kind of fallacy that has to be considered when reasoning about the growing power of computers. For some problems, as the input size grows, the time taken to calculate an answer grows far, far faster - meaning that even with a fairly modest input size, a calculation can be predicted to take many times the current age of the universe.

(Conversely, it is also interesting to note that this kind of thing can work in the opposite direction. We've calculated pi to several trillion decimal places, but pi to just 39 decimal places is sufficient to estimate the circumference of any circle that fits in the observable universe to within the radius of a hydrogen atom.)
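
For the curious, here is a rough back-of-the-envelope check of that claim in Python. The figures for the size of the observable universe and the radius of a hydrogen atom are assumed, approximate values, not something from this article:

    import math

    # Assumed approximate figures:
    universe_diameter = 8.8e26   # diameter of the observable universe in metres
    hydrogen_radius   = 5.3e-11  # radius of a hydrogen atom in metres

    # An error of 10^-k in pi changes the computed circumference by roughly
    # diameter * 10^-k, so we need diameter * 10^-k to be smaller than the
    # radius of a hydrogen atom.
    digits_needed = math.ceil(math.log10(universe_diameter / hydrogen_radius))

    print(digits_needed)  # prints 38, so 39 decimal places is comfortably enough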

I was talking to a friend about this and made what seemed, at the time, a fairly reasonable claim -

 

"I could fairly easy conceive of and write a program for my home laptop that would take longer than the length of the universe to calculate."


Thinking back to the conversation, I realized this was a rather more elaborate claim than I had first thought. I began to think about how I might actually do this.

My initial knee-jerk reaction was something like a program that naively loops over the numbers between zero and a bazillion and adds them together. This would be sure to take a long, long time providing I picked a big enough range. But there are several issues with this. First of all, there is a closed-form solution for the sum of the numbers between n and m, namely the sum of an arithmetic sequence, and it takes the same time to evaluate no matter how many numbers we want to sum. Secondly, I figured I'd probably run out of RAM before the universe ended.
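
To illustrate, here is a minimal Python sketch of the difference (the function names are just for illustration): summing with a loop takes time proportional to how many numbers there are, while the arithmetic-sequence formula takes the same time however big the range is.

    def sum_range_naive(n, m):
        # Add every integer from n to m inclusive - time grows with the size of the range.
        total = 0
        for i in range(n, m + 1):
            total += i
        return total

    def sum_range_closed_form(n, m):
        # Sum of an arithmetic sequence: count * (first + last) / 2.
        # Takes the same time no matter how big the range is.
        count = m - n + 1
        return count * (n + m) // 2

    assert sum_range_naive(0, 10000) == sum_range_closed_form(0, 10000)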

It was clear that whatever problem I picked had to be difficult to solve (geek: run in exponential time, be NP-Complete, or even better NP-Hard), and had to be in some way incremental so as not to blow up my RAM. As well as this, it still had to be explainable and comprehensible to the average person.

I began to consider some of the usual areas in which hard problems crop up and which people tend to understand. Prime numbers are always a good thing to look at, as they are easy to understand, even if some of the theories about them are complicated. Another area that looked promising was cryptography, but computers have gained so much power these days that strong cryptographic algorithms tend to be really quite complicated to explain. And to save my RAM from blowing up, the problem had to be in some way incremental, or a "best fit" problem.

I considered some dense, high dimensional search space problems of my own invention, but most of the things I could come up with didn't seem that natural, and wouldn't really be understood by the layman. As usual, doing some research on the internet proved a better bet and I found the perfect example.

 

The Travelling Salesman Problem

This is a well-known and well-studied problem, notorious for being hard to solve. It goes like this -

 

Given a list of towns, and the distances between each pair of towns, what is the shortest route that visits every town exactly once?


It turns out there is no known technique for solving this problem exactly other than, in essence, trying every combination of routes. There are various techniques for finding a route that is either very good or almost optimal, but nothing that will confirm the true shortest path without trying all of the combinations. Due to the nature of the problem, the number of possible paths grows extremely fast with the number of towns. In fact we can be more specific and say that for n towns, the number of combinations that need to be checked is proportional to n factorial, or n!.

The reason for this is that we have to construct paths of length n, and at each point in the path we need to consider a route continuing out to every town not already visited. If you imagine being at connection m in a path (having visited m towns already), we have to consider paths which continue out to n-m towns (all the unvisited ones), and then for each of these towns we have to repeat the process. Starting routes from each of the n initial towns and repeating this process gives a total number of paths that looks something like this: n * (n-1) * (n-2) * (n-3) * ... * 1, which is exactly n!.
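
A minimal brute-force sketch in Python makes the "try every combination" approach concrete. It assumes a dist(a, b) function giving the distance between two towns, and simply checks all n! orderings:

    from itertools import permutations

    def shortest_route(towns, dist):
        # Brute force: try every possible ordering of the towns and keep the shortest.
        best_route, best_length = None, float("inf")
        for route in permutations(towns):
            length = sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))
            if length < best_length:
                best_route, best_length = route, length
        return best_route, best_length

For 25 towns the loop body runs 25! times, which is exactly where the numbers below come from.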

Because of how massive n! gets when n is even slightly large, this means that at just 25 towns the solution to this problem would already take far, far longer than the age of the universe to compute. We can demonstrate this with some simple maths (and a small code sketch after the list):

  • We can calculate 25 factorial, and find we have 15,511,210,043,330,985,984,000,000 different possible paths between the towns.
  • We can imagine that checking the distance of a route would probably take around a hundred cycles of CPU time
  • So we get a total number of calculations for the problem that is around 1,500,000,000,000,000,000,000,000,000 cycles
  • Modern processors can run at about 3 GHz, which is roughly 3,000,000,000 cycles per second
  • This gives us a total running time for the calculation of around 500,000,000,000,000,000 seconds
  • The current age of the universe is estimated to be about 14 billion years.
  • This is equal to about 441,796,964,000,000,000 seconds, which for argument's sake we will round up to 450,000,000,000,000,000 seconds
  • As you can see, 500,000,000,000,000,000 is bigger than 450,000,000,000,000,000
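
Here is the same arithmetic as a tiny Python sketch (the hundred or so cycles per route check is just the assumption used in the list above):

    import math

    CYCLES_PER_CHECK  = 100                            # assumed cost of checking one route
    CPU_HZ            = 3e9                            # ~3 GHz
    AGE_OF_UNIVERSE_S = 14e9 * 365.25 * 24 * 60 * 60   # ~14 billion years in seconds

    routes       = math.factorial(25)                  # 15,511,210,043,330,985,984,000,000
    total_cycles = routes * CYCLES_PER_CHECK           # ~1.5e27 cycles
    runtime_s    = total_cycles / CPU_HZ               # ~5e17 seconds

    print(f"{runtime_s:.2e} s of computing vs {AGE_OF_UNIVERSE_S:.2e} s since the Big Bang")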

This means that if you had started the calculation at the beginning of the universe, it would, at this point in time, still have a billion years or two left to run.

Even if we had a computer that was a million times more powerful (according to Moore's law, computing power doubles roughly every two years, so this might be a while), it would still take something like sixteen thousand years to find the answer - far beyond the span of a lifetime. Even if we had a computer several trillion times more powerful we would struggle with 30 towns, and calculating for 100 towns would more or less always be out of our grasp.

The other surprising thing about this problem is that it isn't one of a kind - there are many similar problems (a whole class of them). In reality, though, most of the time we don't need the optimal solution and can make do with an alternative, less perfect one, which is often far easier to calculate.

 

Note: Some people have pointed out to me the existence of an algorithm that manages to solve the Travelling Salesman problem faster than plain brute force, by reusing the shortest routes through subsets of cities instead of recomputing them for every permutation. It is called the Held–Karp algorithm, and it still runs in exponential time. Unfortunately it is beyond my maths to explain here, and even with this algorithm the problem grows fast enough for the main point of the article to remain - we're just looking at hundreds of cities rather than tens.


Sources

http://en.wikipedia.org/wiki/Pi
http://www.lycos.com/info/traveling-salesman-problem.html
http://en.wikipedia.org/wiki/Travelling_salesman_problem