Talk:Artificial Intelligence System

From BC-Wiki
Revision as of 22 January 2008, 13:45 by Rebirther (talk | contribs) (Page created: Over a period of 2 days we have simulated 360,000 neurons. The neurons have soma and one dendrite. The simulation is performed using Hodgkin-Huxley models with a simula...)

Over a period of 2 days we have simulated 360,000 neurons. Each neuron has a soma and one dendrite. The simulation is performed using Hodgkin-Huxley models with a simulation time step of 5 milliseconds over a duration of 100 milliseconds. At each time step the computations for all cells are performed in parallel, and the data are stored in embedded databases.
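To illustrate what a Hodgkin-Huxley model computes for each cell at each time step, here is a minimal single-compartment sketch using the classic squid-axon parameters and forward-Euler integration. The project's actual cellular models, parameters, and integration scheme are not published here, so this is only an assumption-laden illustration; note also that forward Euler needs a much smaller step (0.01 ms below) than the 5 ms mentioned above in order to stay numerically stable.

```python
import math

def simulate_hh(duration_ms=100.0, dt=0.01, i_ext=10.0):
    """Return the membrane-voltage trace (mV) for a constant input current.

    Classic Hodgkin-Huxley squid-axon parameters; units are uF/cm^2,
    mS/cm^2, mV, uA/cm^2 and ms.
    """
    c_m = 1.0                               # membrane capacitance
    g_na, g_k, g_l = 120.0, 36.0, 0.3       # maximal conductances
    e_na, e_k, e_l = 50.0, -77.0, -54.387   # reversal potentials

    # start from the resting steady state
    v, m, h, n = -65.0, 0.0529, 0.5961, 0.3177
    trace = []
    for _ in range(int(duration_ms / dt)):
        # gating-variable rate constants (1/ms)
        a_m = 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
        b_m = 4.0 * math.exp(-(v + 65.0) / 18.0)
        a_h = 0.07 * math.exp(-(v + 65.0) / 20.0)
        b_h = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
        a_n = 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
        b_n = 0.125 * math.exp(-(v + 65.0) / 80.0)

        # ionic currents (uA/cm^2)
        i_na = g_na * m**3 * h * (v - e_na)
        i_k = g_k * n**4 * (v - e_k)
        i_l = g_l * (v - e_l)

        # forward-Euler update of voltage and gating variables
        v += dt * (i_ext - i_na - i_k - i_l) / c_m
        m += dt * (a_m * (1.0 - m) - b_m * m)
        h += dt * (a_h * (1.0 - h) - b_h * h)
        n += dt * (a_n * (1.0 - n) - b_n * n)
        trace.append(v)
    return trace
```

With a constant 10 uA/cm^2 input this model fires repetitive action potentials over the 100 ms window; running one such update loop independently per cell is what makes the per-time-step computation embarrassingly parallel.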

To understand the significance of this experiment, compare these values with the estimated number of neurons in the honey bee, around 850,000, and in the human brain, an estimated 100 billion. As the project picks up steam with the launch of applications for other platforms, we should be able to run increasingly large simulations.

Computing Information

The neural network simulator is an application that simulates neurons. Each downloaded work unit contains 5,000 neurons. Because the simulator is in an initial phase and very few cellular models are implemented, we can only use it to test simulation capacity. To date the largest brain simulation has been run on a cluster of 30 machines, with 100 billion neurons simulated over a couple of months. While it was a very interesting experiment that pushed the frontier forward, it was only a partial simulation, in the sense that many of the required components were not implemented due to hardware constraints.
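The splitting of a simulation into 5,000-neuron work units can be sketched as follows. The actual work-unit format and distribution mechanism used by the project are not described here, so the range-based partitioning below is only an assumed illustration of the arithmetic.

```python
WORK_UNIT_SIZE = 5_000  # neurons per downloaded work unit, as stated above

def partition(total_neurons, unit_size=WORK_UNIT_SIZE):
    """Yield (first_id, last_id) neuron-ID ranges, one per work unit."""
    for start in range(0, total_neurons, unit_size):
        yield (start, min(start + unit_size, total_neurons) - 1)

# The 360,000-neuron run above would split into 72 work units:
units = list(partition(360_000))
```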

In that experiment the neurons were created, simulated and then destroyed in memory, without any data being stored. From a practical point of view it did not advance the knowledge much further, which is why we would like to continue along this line of thought and complement those results with practical data. Storage and computing power are essential problems for large-scale brain simulations, because without measuring them we cannot plan and estimate the requirements, and without planning there is no clear understanding of what is needed. As the simulation advances and more and more neurons are simulated, we should be able to make increasingly precise estimates of storage, number of computers required, duration, bandwidth and other factors. Even though at this stage our simulation is imprecise and lacking in many respects, this is what we want to achieve with your help.
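The kind of planning calculation described above might look like the following back-of-envelope sketch. Every number here is an assumption chosen for illustration (the per-neuron state size, what gets persisted, and how often); the point is only to show how measured runs turn into storage estimates.

```python
# Assumed inputs, taken from the figures quoted in this page where possible.
NEURONS = 360_000
STATE_BYTES = 4 * 8    # assume V, m, h, n persisted as 64-bit floats
STEPS = 100 // 5       # 100 ms simulated at a 5 ms step -> 20 steps
WORK_UNIT_SIZE = 5_000

# Storage if the full state of every neuron is kept at every step.
storage_bytes = NEURONS * STATE_BYTES * STEPS
storage_mb = storage_bytes / 1e6

# Number of work units needed to cover the whole run.
work_units = NEURONS // WORK_UNIT_SIZE
```

Under these assumptions the 2-day run would persist roughly 230 MB of raw state across 72 work units; as larger runs complete, the assumed constants can be replaced with measured values to tighten the estimates.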

There is also the added benefit that once we publish these results and the public at large sees that the capacity to simulate the entire brain is considerably higher than previously thought, artificial intelligence will have a large stumbling block removed from its path.