Computing technology has come incredibly far in the last 80 years. No other technology has seen such dramatic improvements in so little time. Supercomputers today are millions of times more powerful than those of 50 years ago, and they're only getting stronger. With this computing power we have the tools to investigate and potentially solve all kinds of scientific problems. Enter computational science, a relatively new field that is changing how scientists approach and solve problems.
Computational science uses these advances in processing power to tackle problems that were previously far too complex to handle mathematically. Various factors contribute to that complexity, such as the number of variables, the number of calculations, or the intricacy of the model. Using computers we can build models of things that would otherwise be impossible to compute in any reasonable time, such as weather systems or geological processes. Essentially, computational science comes down to using computers to aid with and solve complex problems. It's often considered the fourth method of conducting research, alongside experimental, theoretical, and observational science.
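To make that concrete, here's a minimal sketch of the kind of thing I mean: simulating a process step by step instead of solving it by hand. This toy example integrates Newton's law of cooling with the forward Euler method; all the constants are made-up values for illustration, and a real model (say, a weather model) is of course vastly bigger.

```python
# Forward-Euler simulation of Newton's law of cooling: dT/dt = -k * (T - T_env).
# The constants below are assumed values, purely for illustration.
k, T_env = 0.07, 20.0   # cooling constant, ambient temperature
T, dt = 90.0, 1.0       # initial temperature, time step (1 minute)

for minute in range(0, 60, 10):
    print(f"t={minute:2d} min  T={T:5.1f} C")
    for _ in range(10):
        T += dt * (-k * (T - T_env))   # advance the model ten 1-minute steps
```

The same idea, scaled up to millions of variables and time steps, is what makes weather and geological simulation possible at all.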
A link I found while researching this topic is to the SciDAC Review, which lists the top breakthroughs in computational science. The list was incredible, and it helped me understand just how tremendous the contribution of computational science has been to research. I urge you all to take a look; I'm still reading through the article in awe. The link is in the sources below.
Sources:
http://www.shodor.org/chemviz/overview/compsci.html
http://www.scidacreview.org/0901/html/bt.html
Sunday, December 8, 2013
Computer Graphics : Games and Names
Computer graphics is an interesting topic to me. As an avid gamer for most of my life, I've come to appreciate the advance of computer graphics and graphics processing over the last 20 years.
My first experience with them was the original NES and its Picture Processing Unit, the Ricoh 2C02 (the Ricoh 2A03 was the console's CPU). This 8-bit hardware delivered the computing power necessary to process the graphics enjoyed by millions. The chip itself was optimized to use very little memory to store graphics data. That small amount of memory could be extended through mappers found on the game cartridges, quite the ingenious design for extending the lifetime of Nintendo's proprietary hardware.
Similar designs and hardware appeared in the later generations of consoles, but I would rather move on to the advance of PC hardware. id Software is best known for the controversy surrounding its titles DooM and Quake. The games themselves play well, and pushed the envelope on what kind of gruesome artwork could fit into a game. But the company gets too little credit for its efforts in advancing the standard of graphics in game engines. DooM was groundbreaking, Quake was groundbreaking, and Quake 3 was groundbreaking too. Each of these games raised the bar for what people expected from their games. DooM brought sprites rendered in a 3D space to life. Quake realized a full 3D world with 3D models and particles, and Quake 3 took advantage of new technologies and hardware like OpenGL and true GPUs.
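The common thread through all of these engines is turning 3D coordinates into 2D screen positions. Here's a tiny sketch of that core idea, perspective projection by dividing by depth; the focal length and screen size are made-up values, and this illustrates the principle, not id's actual renderers.

```python
# Toy perspective projection: map a 3D point to a 2D screen by dividing x and y
# by the depth z. Focal length and screen center are assumed values for a
# hypothetical 320x200 display -- an illustration, not id Software's code.
def project(x, y, z, focal=100, cx=160, cy=100):
    return (cx + focal * x / z, cy - focal * y / z)

# The same point drifts toward the screen center as it gets farther away:
for z in (2, 4, 8):
    print(f"z={z}: {project(1.0, 1.0, z)}")
```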
This is all thanks in large part to John Carmack, co-founder of id Software. I've known his name since I was a child, but it wasn't until I pursued my degree in Computer Science that I realized how brilliant a man he is. He invented several computer graphics algorithms, took advantage of advances in PC hardware to make his games shine, and made major innovations in 3D graphics. The guy is spoken of entirely in awe, and I'm starting to understand why. He made games what they are today; without him, titles like Call of Duty and Battlefield might not exist. He's the father of modern gaming, and even more so of modern graphics in games.
Sunday, December 1, 2013
Computer Security : Cryptography and Protocols
Information security is an integral part of your daily life on the internet, whether you know it or not. Information transmitted over wireless and cable connections is encrypted and follows security protocols. Without these precautions, the information you transmit would be readily available to anyone observing your internet traffic. Common encryption methods include public key encryption and stream ciphers. Common protocols include WPA, WEP, and others you use every day.
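To show the stream idea in miniature: both sides seed the same generator with a shared key and XOR the keystream onto the message. Python's random module is not cryptographically secure, so this is strictly a sketch of the mechanism, not something to actually protect traffic with.

```python
import random

def xor_stream(data: bytes, key: int) -> bytes:
    # Seed a PRNG with the shared key and XOR each byte with the keystream.
    # NOTE: random.Random is NOT cryptographically secure -- illustration only.
    prng = random.Random(key)
    return bytes(b ^ prng.randrange(256) for b in data)

msg = b"meet me at noon"
ct = xor_stream(msg, key=1337)   # encrypt
pt = xor_stream(ct, key=1337)    # decrypt: XOR with the same stream undoes it
print(ct)
print(pt)                         # b'meet me at noon'
```

Real stream ciphers replace the toy PRNG with a carefully designed keystream generator, but the XOR structure is the same.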
Cryptography has a long history, spanning most of recorded time. The field of modern cryptography and cryptanalysis is young compared to other fields, as its birth paralleled the development of computing technology. Throughout history, people have needed to disguise their information so that it could travel safely. The Romans used the Caesar cipher, in which each letter of a message was exchanged for the letter three places ahead. In this manner A became D, and so on for every letter. Various other methods existed across different civilizations; take a look at the sites I found below, some of which are very interesting and worth checking out.
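The Caesar cipher is simple enough to fit in a few lines; here's a quick sketch, where shifting by -3 undoes the encryption:

```python
def caesar(text, shift=3):
    # Shift each letter by `shift` places, wrapping Z back around to A.
    out = []
    for ch in text.upper():
        if ch.isalpha():
            out.append(chr((ord(ch) - ord('A') + shift) % 26 + ord('A')))
        else:
            out.append(ch)   # leave spaces and punctuation alone
    return ''.join(out)

print(caesar("ATTACK AT DAWN"))       # DWWDFN DW GDZQ
print(caesar("DWWDFN DW GDZQ", -3))   # ATTACK AT DAWN
```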
The reason I describe cryptography's history in such endearing terms is that the contributions of Allied code breakers to the effort in WWII, by some accounts, may have won them the war. As a result, cryptography and cryptanalysis were held in high esteem after WWII, and became funded and regarded as legitimate fields. The NSA was established to further research cryptographic and cryptanalytic methods in the United States, and in the postwar years it was the primary establishment for these fields.
Today computer security is vital. The RC4 stream cipher designed by Ron Rivest was used in WEP, once the most common encryption method for wireless networks, and brilliant minds like Adi Shamir later showed how it could be broken. The interest in security comes from both sides of the coin: for every defense built, there is an attacker in mind. People are constantly probing encryption methods and security protocols for vulnerabilities, either to exploit them or for academic purposes. The class I took recently on information security was very well rounded, and I owe a lot of my interest in the subject to my teachers, Tom Austin and Mark Stamp. Their approach explored both the attacker's mindset and the defender's, and gave us a more versatile understanding of the subject.
Sources:
http://www.nationalmuseum.af.mil/factsheets/factsheet.asp?id=9722
http://users.telenet.be/d.rijmenants/en/timeline.htm
http://www.muslimheritage.com/topics/default.cfm?ArticleID=372
http://www.cypher.com.au/crypto_history.htm
http://cryptozine.blogspot.com/2008/05/brief-history-of-cryptography.html
Sunday, November 24, 2013
Artificial Intelligence: A Brief Rundown
Machine learning is a critical part of artificial intelligence as we know it. Many modern AI schemes are developed to statistically gather information on a matter and build on the information received. There are many methods, some more effective for certain tasks than others. To some, AI seems like an exotic concept, far on the horizon. When I told a friend I was taking a class on AI systems, his first question was "Are you building Skynet?" AI is a recurring theme in popular science fiction literature and film, and as such has picked up some misconceptions. Scientists aren't trying to end the world with a supercomputer run wild, and really aren't even close. However, there have been important developments in the field that allow for significant improvements in managing big data and regulatory systems, through concepts such as machine learning, or Weak AI.
The field of AI research has been around for a long time, having been founded in 1956. The holy grail of AI was then, and still is now, to develop a system that could emulate the human brain and reason on the same level as a human being. Researchers' initial, and in hindsight optimistic, predictions stated that such a machine should exist in less than a generation. The prospect of machines doing all of our most dangerous or menial jobs was enchanting, and garnered massive government funding for the time. Researchers had completely underestimated the difficulty of their task, however, and by 1973 funding for these projects had stopped almost completely. Research continued, despite the setbacks of what became known as the AI Winter.
Today, you could say we have come out of the winter and into the spring of AI research. Problems that were deemed unsolvable in the heyday of AI in the '60s have been solved and applied to various technologies. Milestones include autonomous cars, chess programs, and more recently IBM's Watson.
AI research has fragmented into a variety of subfields, each focusing on individual problems and concepts. Machine learning is one of these, and has proven to be the most practical. It is the study of computer algorithms that learn to do things: data is observed, and the algorithm tries to learn from this data to do better at a task in the future. The data may be fed in manually, but the decision-making process itself is meant to be automatic. There are limits to what problems can be 'learned' in this manner, and the easiest tend to be classification problems. These include character recognition, facial recognition, language understanding, spam filtering, and fraud detection. A tiny sketch of the idea follows below.
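Here's a minimal sketch of that classification idea, a nearest-centroid classifier in plain Python. The "features" and data are entirely made up; real spam filters use far richer features and models, but the learn-from-examples-then-predict shape is the same.

```python
# Toy nearest-centroid classifier: average the feature vectors of each class
# from labeled examples, then label new points by the closest class average.
from math import dist

def train(examples):
    # examples: list of (feature_vector, label) pairs
    sums, counts = {}, {}
    for x, y in examples:
        sums.setdefault(y, [0.0] * len(x))
        counts[y] = counts.get(y, 0) + 1
        sums[y] = [s + v for s, v in zip(sums[y], x)]
    return {y: [s / counts[y] for s in sums[y]] for y in sums}

def predict(centroids, x):
    return min(centroids, key=lambda y: dist(centroids[y], x))

# Made-up features per message: (exclamation marks, suspicious words).
data = [((8, 5), "spam"), ((7, 6), "spam"), ((0, 1), "ham"), ((1, 0), "ham")]
model = train(data)
print(predict(model, (6, 4)))   # spam
print(predict(model, (1, 1)))   # ham
```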
There are some philosophical issues around AI development, and our society as a whole will need to face them as we approach the holy grail. They are tough questions, and I will readily say I'm not qualified to tackle them! I just know that if we ever make some Strong AI with a godly intellect, I would ask that it be denied access to our nuclear stockpile.
Sources:
http://www.i-programmer.info/babbages-bag/297-artificial-intelligence.html
http://www.cs.princeton.edu/courses/archive/spr08/cos511/scribe_notes/0204.pdf
http://library.thinkquest.org/2705/history.html
http://aitopics.org/misc/brief-history
Sunday, November 17, 2013
History of Computer Science : Some Technologies
The history of Computer Science has been marked by major innovations in technology. Personally, I've lived through the proliferation of personal computers, smartphones, and gaming consoles. This has all happened within the last 30 years, and marks my own experience with technologies brought to consumers through Computer Science. The origins of computers and Computer Science date back much further. I will give a brief rundown of the various tech that accompanied these advances in Computer Science.
Early Mechanical Calculators
The simplest devices can aid in your calculations, and people have been seeking such aids for over five millennia. The earliest record of an abacus dates back to around 3000 B.C., and it has evolved throughout the history of mankind. Just as we seek to advance our technologies today, people have been doing so through the ages. The modern abacus is a large step from the ancient counting boards from which it originated. Other mechanical calculators exist as well, such as the slide rule. One that is very interesting is the Antikythera Mechanism, which is worth looking up.
Punched Cards
At the dawn of the 19th century, Joseph-Marie Jacquard created a loom where the pattern to be woven was set by punched cards. He could change the design of the weave without actually changing the design of the loom, marking a significant advance in automation. Some 30 years later, the pioneer Charles Babbage would use these punched cards as a means to store programs for his Analytical Engine. Later in the century the cards would become the standard for data storage. Machines were designed to process them, and using these machines the 1890 US Census was completed months ahead of schedule and far under budget.
The Analytical Engine and Charles Babbage
Charles Babbage is widely renowned as the father of computing. His work spanned the first half of the 19th century and produced some incredible results. He designed a calculator capable of computing numbers to eight decimal places, and made plans for a machine that would become the foundation for all modern computers. This was the Analytical Engine, considered the first design for a general-purpose computer. He envisioned a massive brass-and-steel, steam-powered computing machine, but the principles were the same as today's. In this machine he planned to use an ALU, control flow, and integrated memory. It was only when electric computers were designed that engineers realized Charles Babbage had anticipated features of their designs by nearly a century.
The Mechanical Office
Before computers, around the 1920s, the office was dominated by three machines: the typewriter, the filing system, and the automated adding machine. Today, all of the functions provided by these devices are still necessary, but they have been thoroughly taken over by the computer. The supply of these devices was controlled by four major companies: Remington Rand, National Cash Register, the Burroughs Adding Machine Company, and IBM. It was only when IBM embraced the new technology of computers that it leapt ahead of the competition. The use of these machines allowed businesses to operate more efficiently and accurately.
The First Commercial Computers
The Harvard Mark I was an automatic digital computer. It could perform the four basic arithmetic functions, and had subroutines that allowed it to handle logarithms and trigonometric functions. It was one of a kind, and preceded the stored-program computers that would become the norm. The EDSAC was the first practical stored-program computer; its programs were read in from punched paper tape and held in memory for execution.
In 1951, the first UNIVAC was delivered to the US Census Bureau. This marked the beginning of commercial sales of stored-program computers in the US. The UNIVAC could serve multiple purposes, in contrast to earlier computers, which were built for a single purpose and owned by their manufacturers.
HP-35
Fast forward to 1972: Hewlett-Packard releases the HP-35, its first pocket calculator. It was the world's first scientific pocket calculator, with trigonometric and exponential functions. It used RPN (Reverse Polish Notation), could store intermediate results in its solid-state memory, and could display entries in scientific notation. It marked a point where computational tools were getting smaller and more advanced, a trend that would continue. A small sketch of how RPN evaluation works follows below.
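If you've never used RPN, the idea is that operands come first and the operator afterward, with every operation working off a stack. Here's a minimal evaluator to illustrate the principle; the real HP-35 used a fixed four-register stack, which this sketch doesn't try to replicate.

```python
# Minimal RPN (Reverse Polish Notation) evaluator: push operands on a stack,
# and each operator pops its two arguments off the top.
import operator

OPS = {'+': operator.add, '-': operator.sub,
       '*': operator.mul, '/': operator.truediv}

def eval_rpn(expression):
    stack = []
    for token in expression.split():
        if token in OPS:
            b, a = stack.pop(), stack.pop()   # right operand is on top
            stack.append(OPS[token](a, b))
        else:
            stack.append(float(token))
    return stack.pop()

# (3 + 4) * 2 in RPN -- no parentheses needed:
print(eval_rpn("3 4 + 2 *"))   # 14.0
```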
The Xerox Star
The Xerox Star was an early PC built by Xerox in 1981. It was the first commercial computer to integrate a graphical user interface on a monitor, controlled by a mouse and keyboard. It never took off commercially, much to Xerox's dismay today. It greatly influenced the design of the personal computers that would be released in the coming decades.
A video of it in action is linked below; through some research I realized I'd never actually seen the mythical Xerox GUI. For anyone curious, it's a pretty interesting sight!
Xerox Star User Interface (1982) 1 of 2 - YouTube
The Apple Lisa and Personal Computers
From here on we know the story: Apple continued development of the personal computer with a GUI. We saw competition, and in the '80s PCs became more common. By the end of the '90s you were hard pressed to find a classroom without PCs, let alone a home. Smartphones were right around the corner, and now we are surrounded by computers, whether we like it or not.
The history of computing and Computer Science is really interesting, and I enjoyed doing some extra research to read about the figures and machines that made it possible. Personally, I'm more interested in the machines, as my own history with computers deals directly with the technologies produced as a result of all that Computer Science. I remember my first computer: I'd have to sort through unlabeled 5 1/4" floppy disks just to see what was on them. I was mostly curious about what could be on these things; more often than not I was just looking for a game. I hit the constraints of my computer when I found Joust VGA, which I couldn't play because, well... I didn't have the appropriate VGA hardware. Being a 6-year-old, I had no idea what VGA was, or why it wasn't letting me play this game. I tried my best to learn and understand what these things were, and as a result ignited my passion for technology and computing. The rest, well, it's history.
Resources:
http://en.wikipedia.org/wiki/History_of_computer_science
http://en.wikipedia.org/wiki/History_of_computing_hardware
http://www.computerhistory.org/
http://www.eingang.org/Lecture/
http://campus.udayton.edu/~hume/Computers/comp2.htm