Sunday, November 24, 2013

Artificial Intelligence: A Brief Rundown

    Machine learning is a critical part of Artificial Intelligence as we know it. Many modern AI systems are built to statistically gather information about a problem and improve using the information they receive. There are many methods, some more effective for certain tasks than others. To some, AI seems like an exotic concept, far on the horizon. When I told a friend I was taking a class on AI systems, his first question was "Are you building Skynet?" AI is a recurring theme in popular science fiction literature and film, and as such has picked up a bit of misconception. Scientists aren't trying to end the world with a supercomputer run wild, and really aren't even close. However, there have been important developments in the field that allow for significant improvements in the management of big data and regulatory systems through concepts such as machine learning, or Weak AI.

    The field of AI research has been around for a long time, having been founded in 1956. The holy grail of AI was then, and is still now, to develop a system that can emulate the human brain and reason on the same level as a human being. Researchers' initial, and in hindsight optimistic, predictions stated that such a machine would exist in less than a generation. The prospect of machines doing all of our most dangerous or menial jobs was enchanting, and it garnered massive government funding for the time. Researchers had completely underestimated the difficulty of their task, however, and by 1973 funding for the projects had stopped almost completely. Research continued despite the setbacks of what became known as the AI Winter.
Today, you could say we have come out of the winter and into the spring of AI research. Problems that were deemed unsolvable in the heyday of AI in the 1960s have been solved and applied to various technologies. Milestones include autonomous cars, chess programs, and more recently IBM's Watson.

    AI research has fragmented into a variety of subfields, each focusing on individual problems and concepts. Machine learning is one of these, and it has proven to be the most practical. It is the study of computer algorithms that learn to do things: the algorithm observes data and tries to learn from it in order to perform better at a task in the future. The data may be fed in manually, but the decision-making process itself is meant to be automatic. There are limits to what problems can be 'learned' in this manner, and the easiest tend to be classification problems. These include character recognition, facial recognition, language understanding, spam filtering, and fraud detection. A toy sketch of this idea follows below.
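
To make the learning-from-data idea concrete, here is a minimal sketch of a naive Bayes text classifier in Python, the kind of statistical approach used in spam filtering. The training messages and word counts are a made-up toy illustration of the principle, not any real system's data or code.

```python
# A toy 'learning from data' example: a naive Bayes spam filter.
# The training messages below are made up purely for illustration.
from collections import Counter
import math

# Hypothetical labeled data: (message, is_spam)
training = [
    ("win money now", True),
    ("free prize claim now", True),
    ("meeting at noon", False),
    ("lunch tomorrow with the team", False),
]

spam_words, ham_words = Counter(), Counter()
for text, is_spam in training:
    (spam_words if is_spam else ham_words).update(text.split())

def spam_score(message):
    """Log-odds that a message is spam, with add-one smoothing."""
    vocab = len(set(spam_words) | set(ham_words))
    score = 0.0  # equal priors here, since half the training data is spam
    for word in message.split():
        p_spam = (spam_words[word] + 1) / (sum(spam_words.values()) + vocab)
        p_ham = (ham_words[word] + 1) / (sum(ham_words.values()) + vocab)
        score += math.log(p_spam / p_ham)
    return score

print(spam_score("claim your free money"))  # positive: leans spam
print(spam_score("team meeting tomorrow"))  # negative: leans ham
```

A real filter would train on thousands of labeled messages, but the principle is the same: counts gathered from observed data drive the future decisions, with no hand-written rules.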
 
There are some philosophical issues surrounding AI development, and our society as a whole will need to face them as we approach the holy grail. They are tough questions, and I will readily say I'm not qualified to tackle them! I just know that if we make some Strong AI with a godly intellect, I would ask that it be denied access to our nuclear stockpile.

Sources:
http://www.i-programmer.info/babbages-bag/297-artificial-intelligence.html
http://www.cs.princeton.edu/courses/archive/spr08/cos511/scribe_notes/0204.pdf
http://library.thinkquest.org/2705/history.html
http://aitopics.org/misc/brief-history

Sunday, November 17, 2013

History of Computer Science: Some Technologies

The history of Computer Science has been marked by major innovations in technology. Personally, I've lived through the proliferation of personal computers, smartphones, and gaming consoles. This has all happened within the last 30 years, and it marks my experience with the technologies Computer Science has brought to consumers. The origins of computers and Computer Science date back much further. I will give a brief rundown of the various tech that accompanied these advances in Computer Science.


Early Mechanical Calculators

The simplest devices can aid in your calculations, and people have been seeking such aids for over five millennia. The earliest record of an abacus dates back to 3000 B.C., and the device has evolved throughout the history of mankind. Just as we seek to advance our technologies today, people have been doing so through the ages. The modern abacus is a large step from the ancient counting boards from which it originated. Other mechanical calculators exist as well, such as the slide rule. One very interesting example is the Antikythera Mechanism, which is worth looking up.

Punched Cards

At the dawn of the 19th century, Joseph-Marie Jacquard created a loom where the pattern to be woven was set by a punched card. He could change the design of the weave without actually changing the design of the loom, marking a significant advancement in automation. Some 30 years later, the pioneer Charles Babbage would use these punched cards as a means to store programs for his Analytical Engine. Later in the century the cards became the standard for data storage. Machines were designed to process them, and using these machines the 1890 US Census was completed months ahead of schedule and far under budget.

The Analytical Engine and Charles Babbage

Charles Babbage is widely renowned as the father of computing. His work spanned the first half of the 19th century and produced some incredible results. He designed a calculator capable of computing numbers to eight decimal places, and he made plans for a machine that would become the foundation for all modern computers. This was the Analytical Engine, considered the first general-purpose computer design. He envisioned a massive brass-and-steel, steam-powered computing machine, but the principles were the same as today's: his plans included an ALU, control flow, and integrated memory. It was only when electronic computers were designed that engineers realized Charles Babbage had anticipated features of their designs by nearly a century.

The Mechanical Office

Before computers, around the 1920s, the office was dominated by three machines: the typewriter, the filing system, and the automated adding machine. Today, all of the functions provided by these devices are still necessary, but they have been thoroughly replaced by the computer. The supply of these three devices was controlled by four major companies: Remington Rand, National Cash Register, the Burroughs Adding Machine Company, and IBM. It was only when IBM embraced the new technology of computers that it leapt ahead of the competition. The use of these machines allowed businesses to operate more efficiently and accurately.

The First Commercial Computers

Harvard Mark I: The Harvard Mark I was an automatic digital computer. It could perform the four basic arithmetic functions, and it had subroutines that allowed it to handle logarithms and trigonometric functions. It was one of a kind, and it preceded the stored-program computers that would become the norm. The EDSAC was the first practical stored-program computer; its programs were read in from punched paper tape and held in memory for execution.

In 1951, the first UNIVAC was delivered to the US Census Bureau. This marked the beginning of commercial sales of stored-program computers in the US. The UNIVAC could serve multiple purposes, unlike the earlier computers, which were built for a single purpose and owned by their manufacturers.

HP-35

Fast-forward to 1972: Hewlett-Packard releases its HP-35, the company's first pocket calculator. It was the world's first scientific pocket calculator, offering trigonometric and exponential functions. It used RPN (Reverse Polish Notation), could store intermediate results in its solid-state memory, and could display entries in scientific notation. It marked a point where computational tools were getting smaller and more advanced, a trend that would continue. A quick sketch of how RPN evaluation works is below.
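
RPN dispenses with parentheses by putting each operator after its operands, which maps naturally onto a stack. Here is a minimal Python sketch of that idea; it illustrates the concept of stack-based evaluation only, and is in no way a reproduction of the HP-35's actual firmware.

```python
# A toy RPN evaluator: operands are pushed onto a stack, and each
# operator pops its two arguments and pushes the result.
import operator

OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "/": operator.truediv}

def eval_rpn(tokens):
    stack = []
    for tok in tokens:
        if tok in OPS:
            b, a = stack.pop(), stack.pop()  # right operand comes off first
            stack.append(OPS[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()

# "(3 + 4) * 2" keyed in RPN style: 3 4 + 2 *
print(eval_rpn("3 4 + 2 *".split()))  # 14.0
```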

The Xerox Star

The Xerox Star was an early personal workstation released by Xerox in 1981. It was the first commercial computer to integrate a graphical user interface on a monitor, controlled by a mouse and keyboard. It sold poorly, much to Xerox's dismay today. It greatly influenced the design of the personal computers that would be released in the coming decades.
A video of it in action is linked below; through some research I realized I'd never seen the mythical Xerox GUI. For anyone curious, it is a pretty interesting sight!

Xerox Star User Interface (1982) 1 of 2 - YouTube

The Apple Lisa and Personal Computers

From here on we know the story: Apple continued development of the personal computer using a GUI. Competition followed, and in the 1980s PCs became more common. By the end of the 90s we were hard pressed to find classrooms without PCs, let alone homes. Smartphones were right around the corner, and now we are surrounded by computers, whether we like it or not.


The history of computing and Computer Science is really interesting, and I enjoyed doing some extra research to read about the figures and machines that made it possible. Personally, I'm more interested in the machines, as my own history with computers deals directly with the technologies produced as a result of that Computer Science. I remember my first computer: I'd have to sort through unlabeled 5 1/4" floppy disks just to see what was on them. I was mostly curious about what could be on these things, though more often I was just looking for a game. I hit the constraints of my computer when I found Joust VGA, which I couldn't play because, well... I didn't have the appropriate VGA hardware. Being a 6-year-old, I had no idea what VGA was, or why it wasn't letting me play this game. I tried my best to learn and understand what these things were, and doing so ignited my passion for technologies and computing. The rest, well, it's history.

Resources:
http://en.wikipedia.org/wiki/History_of_computer_science
http://en.wikipedia.org/wiki/History_of_computing_hardware
http://www.computerhistory.org/
http://www.eingang.org/Lecture/
http://campus.udayton.edu/~hume/Computers/comp2.htm