The Ancient Greeks Love Neural Networks

Networks are a crucial part of current research in computing, and one of the most important ways they are now being used is through the concept of a neural network.  According to this video, a neural network consists of an input layer, one or more hidden layers, and an output layer, and the computer uses it to predict what the input is.  The input is frequently an image represented as numbers, and those numbers are passed through mathematical functions to output the probability that the image is of a specific type.  Researchers have refined these networks to the point where, on some tasks, they perform better than humans.
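
To make that concrete, here is a minimal sketch in Python with NumPy.  The layer sizes, the three output classes, and the randomly initialized weights are my own placeholder assumptions, standing in for values a real network would learn; the point is just to show numbers flowing from an input layer through a hidden layer into class probabilities.

import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Turn raw scores into probabilities that sum to 1.
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy "image": an 8x8 grid of pixel intensities flattened into 64 numbers.
x = rng.random(64)

# Placeholder weights; a real network learns these from data.
W1, b1 = rng.normal(size=(16, 64)), np.zeros(16)   # input layer -> hidden layer
W2, b2 = rng.normal(size=(3, 16)), np.zeros(3)     # hidden layer -> 3 output classes

h = np.maximum(0, W1 @ x + b1)   # hidden layer: weighted sums passed through ReLU
probs = softmax(W2 @ h + b2)     # output layer: a probability for each class

print(probs, probs.sum())        # three probabilities that sum to 1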

An example of this occurred very recently.  A group of researchers studying ancient Greek engravings, which are often damaged, used a neural network to figure out what the missing words were.  This was achieved by having the AI learn patterns from existing relics.  The AI proved to be much more accurate than human experts at predicting which words were originally present.  The researchers believe they can use the AI in a collaborative effort to fully reconstruct the engravings.
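
To give a rough sense of how a model could learn such patterns, here is a toy Python sketch of the fill-in-the-gap training idea.  This is my own illustration, not DeepMind's actual system, and it uses English placeholder text instead of Greek: it hides a span of characters in complete inscriptions and keeps the hidden span as the answer the model should recover.

import random

random.seed(1)

# Stand-in "inscriptions"; the real corpus would be Greek text from surviving relics.
inscriptions = [
    "the people honored the goddess athena",
    "the council and the people decided",
]

def make_training_pair(text, gap_len=5):
    # Hide a random span of characters; the hidden span is what the model must predict.
    start = random.randrange(len(text) - gap_len)
    answer = text[start:start + gap_len]
    masked = text[:start] + "#" * gap_len + text[start + gap_len:]
    return masked, answer

for text in inscriptions:
    masked, answer = make_training_pair(text)
    print(masked, "->", answer)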

I think this is a particularly interesting application of networks.  Neural networks are at the forefront of technological advancement, pushing the limits of what humanity can do.  I also appreciate their underlying simplicity.  Despite seeming like an intimidating concept, the same ideas from graph theory still apply: one can analyze the relationships between nodes and edges, and how the nodes interact with one another, largely through math.  The structure is also interesting to look at.  Neural networks are examples of directed graphs, with information traveling from the input layer to the output layer.  However, there is no sense of a center, unlike many of the social networks studied earlier in the class.  Instead of radiating outward from a central node, a neural network follows a more sequential, layered pattern.  There is also a notion of positive and negative relationships, since the edges carry signed weights that are adjusted as the network is trained.  Even though the bulk of the network is constantly changing and optimizing, the basics of graph theory still apply, which is really cool to think about.
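
As a small illustration of that graph-theory view, here is a toy Python sketch of a made-up three-layer network with invented weights (not any real trained model), treating each connection as a directed edge carrying a signed weight.

# A tiny neural network viewed purely as a directed, weighted graph.
# Each edge points forward (input -> hidden -> output), and its sign marks
# a positive or negative relationship between the two nodes it connects.
edges = [
    ("in1", "h1", 0.8), ("in1", "h2", -0.3),
    ("in2", "h1", -1.1), ("in2", "h2", 0.5),
    ("h1", "out", 1.2), ("h2", "out", -0.7),
]

for u, v, w in edges:
    kind = "positive" if w > 0 else "negative"
    print(f"{u} -> {v}: weight {w:+.1f} ({kind} edge)")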


Source used: https://www.newscientist.com/article/2220438-deepmind-ai-beats-humans-at-deciphering-damaged-ancient-greek-tablets/
