And here we are at the end of the spring 2011 semester, and Communicating Science is ending with it. As such, this will be my last post for the blog project and for ENIAC Beyond. I may start it back up again once I get settled into the computer science industry, but that is probably far in the future. Maybe then I'll be able to post about technologies and systems I'm working on instead of finding articles and stories about great things other people have already done.
Writing has never really been a strong suit of mine. It's always been a struggle, staring at a computer screen for hours to put down a few well-placed sentences. I still remember writing my first two blog posts, a simple welcoming post and an overly technical piece about the progress of processing power over the decades. I stayed up until 4 in the morning to polish off those posts, which is not a great feeling looking back and seeing how entrenched in the deficit model the processing post turned out to be.
I caught a lucky break with Watson's appearance on Jeopardy! The event was easy to follow and entertaining, able to catch attention quickly. Those posts were where I slowly started to get into a rhythm of posting about engaging stories and events. Later posts tried to draw from more interactive sources or to be something more than a wall of text that bores the reader. The writing process never really got any easier; each time I sat down to write a blog post it was a long, drawn-out process, but I was feeling better about my posts as time went on.
Thanks for reading my blog. It's definitely been a learning experience and a good introduction into the difficulty and importance of effective science communication. Hopefully I can take what I learned here and apply it to all that I do in the future.
For now, I'm signing off from the blogging world. You stay classy, San Diego.
Wednesday, May 4, 2011
Monday, May 2, 2011
Ludum Dare: Game Creation in a Weekend
Just this past weekend, a community known as Ludum Dare held its twentieth game creation competition, all in the span of 48 hours. Each game was created by a single programmer who built every asset (graphics, sounds, and player interactions) from scratch around a central theme and pulled it all together for the competition. There are no prizes for winning, but it gives people a great reason to come together and create some small, quirky, hopefully fun games. Over 350 games were submitted. If you have the time during these last few weeks of school, fire one or two up and give them a shot (some may require special compilation to work, but others run through Flash in your web browser). You may be pleasantly surprised.
More Algorithm Visualizations
A little over a week ago I wrote a post about a university in Europe that used Hungarian folk dance to explain a few basic sorting algorithms. Today, I ran into another site that does an excellent job showing exactly how different algorithms work, maybe not with as much toe tapping as the folk dance but effective nonetheless. Associate Professor David Galles from the University of San Francisco filled his site with JavaScript animations of various algorithms, each very important for a solid base in computer science.
First are data structures such as queues and stacks, where you can push values into the structure and pop them off. Stacks are extremely important for programming because they are used to keep track of function calls: when a new function is called, it is placed on top of the calling function; the called function performs its tasks and is then popped off the stack, returning control to the calling function. Queues can be used as a method of resource allocation, handing out resources in a first-come, first-served order (far too basic an approach for real resource allocation, but that's not the point).
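A minimal sketch of those two behaviors in Python (not the site's animations, just the push/pop semantics), using a plain list as the stack and collections.deque as the queue:

from collections import deque

# Stack: last in, first out -- like nested function calls.
stack = []
stack.append("main")      # main() starts
stack.append("helper")    # main() calls helper()
print(stack.pop())        # helper() returns -> "helper"
print(stack.pop())        # control returns to main() -> "main"

# Queue: first in, first out -- a simple first-come, first-served line.
queue = deque()
queue.append("job 1")
queue.append("job 2")
print(queue.popleft())    # "job 1" is served first
print(queue.popleft())    # then "job 2"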
Indexing is a complex issue of storing and retrieving information: given a piece of information, what is the best way to store it quickly while still being able to retrieve it just as quickly? Each method picks a different starting point and works from there. A binary search tree, for example, treats the first node as the root and uses comparisons to place additional data, passing a value down to the left when it is smaller than the current node or to the right when it is larger. Hash mapping uses the value of the data to place it in different bins, so that, ideally, only a small portion of the data has to be stepped through rather than the entire set.
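Here's a rough Python sketch of a binary search tree insert and lookup, following the left-smaller/right-larger rule described above; the Node class and the sample values are made up for illustration:

class Node:
    # One node of a binary search tree.
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def insert(root, value):
    # Smaller values go left, larger (or equal) values go right.
    if root is None:
        return Node(value)
    if value < root.value:
        root.left = insert(root.left, value)
    else:
        root.right = insert(root.right, value)
    return root

def contains(root, value):
    # Walk down the tree, discarding part of the remaining nodes each step.
    while root is not None:
        if value == root.value:
            return True
        root = root.left if value < root.value else root.right
    return False

root = None
for v in [50, 30, 70, 20, 40]:
    root = insert(root, v)
print(contains(root, 40), contains(root, 99))  # True False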
The sorting algorithms are mostly the ones I've shown before, with a few new ones added. Radix and bucket sort are pretty interesting fare, using digit position as well as value to sort. A drawback of these sorts is that they require additional space to hold partially sorted data, increasing overhead. Heaps are a type of data structure that keeps the largest value at the root node, with each child smaller than its parent, and they can be used as the basis of a sorting algorithm, heapsort.
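Two quick sketches in Python, assuming non-negative integers: an LSD radix sort, where the digit buckets are the extra space mentioned above, and a heapsort built on the standard heapq module (note that heapq is a min-heap, smallest value at the root, rather than the max-heap described above):

import heapq

def radix_sort(nums, base=10):
    # LSD radix sort: bucket by each digit, least significant first.
    # The buckets are the extra space mentioned above.
    place = 1
    while nums and place <= max(nums):
        buckets = [[] for _ in range(base)]
        for n in nums:
            buckets[(n // place) % base].append(n)
        nums = [n for bucket in buckets for n in bucket]
        place *= base
    return nums

def heapsort(values):
    # Build a heap, then pop the values back out in ascending order.
    heap = list(values)
    heapq.heapify(heap)
    return [heapq.heappop(heap) for _ in range(len(heap))]

print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))
print(heapsort([5, 1, 9, 3, 7]))  # [1, 3, 5, 7, 9]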
The graph algorithms apply when you are given a graph of nodes with some cost associated with moving between them. A depth-first search finds a path between a start node and an end node by moving forward until it reaches a dead end, then backtracking to the previous node until it finds a different route. A breadth-first search sends out paths in every direction, exploring each level of depth before moving on to the next. Dijkstra's and Prim's algorithms are generally used to find the shortest or least costly paths between nodes. I found another site that gives a good visualization of how Dijkstra's algorithm works.
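A compact, illustrative Python version of Dijkstra's algorithm over a tiny made-up graph, using a priority queue of (cost, node) pairs; this is just a sketch of the idea the visualization animates:

import heapq

def dijkstra(graph, start):
    # Shortest-path cost from start to every reachable node.
    # graph: {node: [(neighbor, edge_cost), ...]} with non-negative costs.
    dist = {start: 0}
    pq = [(0, start)]                      # (cost so far, node)
    while pq:
        cost, node = heapq.heappop(pq)
        if cost > dist.get(node, float("inf")):
            continue                       # stale queue entry, skip it
        for neighbor, edge in graph[node]:
            new_cost = cost + edge
            if new_cost < dist.get(neighbor, float("inf")):
                dist[neighbor] = new_cost
                heapq.heappush(pq, (new_cost, neighbor))
    return dist

graph = {
    "A": [("B", 1), ("C", 4)],
    "B": [("C", 2), ("D", 5)],
    "C": [("D", 1)],
    "D": [],
}
print(dijkstra(graph, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}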
These visualizations can give an excellent view into how these algorithms actually function. Playing around with them can help clear up any uncertainty about how they work, so give it a shot.
Thursday, April 28, 2011
Program Like in the Movies
If you've ever wanted to hack like in the movies but never thought your programming skills were up to snuff, or you can't type a thousand words per minute, have some fun with Hacker Typer. Select the file you want to generate and then go nuts on the keyboard. In seconds you'll transform into a Hollywood programmer, typing perfect code impossibly fast. If the code generated seems pretty cryptic, don't worry; even I had to do some digging to figure out what the second and third file choices were. Mobile Substrate is used by third-party groups to patch system functions on the iPhone, and fini.sh looks like some kind of Linux shell script, but I couldn't find any real information about it.
But the first choice is the one I want to talk about: the Linux kernel. A kernel is the core of the operating system, the bridge between software applications and hardware-level devices. All user-generated resource requests, things like saving to disk or loading a program into memory for quick access, are directed through the kernel. The kernel takes these requests, generates the system calls to the appropriate devices, and returns the necessary information to the application to be passed on to the user.
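As a rough illustration (in Python rather than kernel code), a call like os.write is a thin wrapper around the write system call: user code hands the request to the kernel through a file descriptor, and the kernel decides how the bytes reach the actual device:

import os

# A high-level call like print() eventually becomes a write() system call.
# File descriptor 1 is standard output; the kernel handles the device itself.
message = b"hello from user space\n"
bytes_written = os.write(1, message)   # ask the kernel to write to stdout

# The kernel reports back how much work it actually did.
print(bytes_written, "bytes handed off to the kernel")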
Direct kernel interaction can be dangerous without the proper knowledge. The kernel has the fewest restrictions on what it can and cannot access in the computer. It acts as the resource manager for the central processing unit, memory, and I/O devices, keeping applications and devices from interfering with each other in ways that could cause catastrophic crashes, while doling out resources for system requests quickly and efficiently.
Kernel coding is considered one of the more difficult programming practices, as it requires real finesse to use the raw power of the kernel without damaging any critical part of the computer. So using Hacker Typer as a quick sandbox, just for fun, can be a nice distraction.
Tuesday, April 26, 2011
Computer Greats: Grace Hopper
One of the things I love about today's technology is its ability to pull back things that seemed lost to the void of time, never to be seen again. I was wandering around the web and happened to stumble upon something I thought I'd never see: a set of two videos uploaded to YouTube of a 60 Minutes interview with Grace Hopper from nearly thirty years ago.
Grace Hopper was one of the greatest pioneers of early computer science and a powerful force in communicating it. She was one of the programmers of the Harvard Mark I in 1944, a machine often cited as marking the beginning of the modern computer era. In 1949, she assisted in the development of the UNIVAC I, the first commercially produced computer. While working on the UNIVAC system, Hopper created the very first compiler for electronic computer systems. Compilers are incredibly important for modern programmers, as they allow us to write human-readable code which the compiler then translates into machine code the computer can run. Later, Hopper participated in the creation of the widely used language COBOL, based on a simpler language she had created earlier. After her retirement from the Navy, Hopper went on a lecture circuit, educating students, military personnel, and business leaders about computer science.
0:03 Grace laces her talks with a bit of humor and personal history, techniques we've seen as positive speaking techniques in Communicating Science.
1:35 Grace describes the current state of the computer science revolution, remarking that it is still in its infancy.
3:49 Grace hands out bits of wire to her audiences to give them a feel for a billionth of a second. Each wire is cut to the distance light travels in a vacuum in a nanosecond. She then pulls out a coil of wire nearly a thousand feet long to represent a microsecond (a quick check of these numbers follows below).
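A quick back-of-the-envelope check of those props, using the speed of light in a vacuum:

# How far does light travel in a nanosecond and in a microsecond?
c = 299_792_458                     # speed of light in a vacuum, meters per second

nanosecond_wire = c * 1e-9          # about 0.30 meters, roughly a foot of wire
microsecond_coil = c * 1e-6         # about 300 meters

print(f"nanosecond:  {nanosecond_wire:.2f} m ({nanosecond_wire * 39.37:.1f} in)")
print(f"microsecond: {microsecond_coil:.0f} m ({microsecond_coil * 3.281:.0f} ft)")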
The second part of the interview turns a bit away from her contributions and work in the field of computer science and more towards military, politics, and gender.
Grace Hopper is one of the most important figures in the dawn of the modern computer era. It was wonderful to find these videos of such a keen lady uploaded for everyone to see.
Wednesday, April 20, 2011
Sort Your Heart Out
Sorting is a fundamental problem for computer scientists. Given a list of unordered objects and a way to compare them, what is the best way to sort them into a logical order? There are many different answers to this question, ranging in simplicity and efficiency. Sapientia University in Romania took a few of the simplest sorting algorithms and showed how they work in a fairly novel way: through the magic of dance. Using a local Central European folk dance team and a folk band, they show off some of the simpler sorting algorithms in a livelier fashion than most demonstrations of sorting.
The first and simplest of the sorts is bubble sort. Bubble sort starts at the beginning of the list and compares the first two objects. It does nothing if those two objects are in the correct order, or swaps them if they are not. The algorithm then compares the second and third elements in the list, swapping when necessary and forcing the higher value up the list, "bubbling" it towards where it needs to go. This continues until the sort reaches the end of the list, at which point it goes back to the beginning and starts over. The process repeats until an entire pass across the list produces no swaps; at that point the list is sorted and the algorithm stops.
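A minimal Python sketch of bubble sort as described, stopping once a full pass makes no swaps:

def bubble_sort(items):
    # Repeatedly sweep the list, swapping out-of-order neighbors,
    # until a full pass makes no swaps.
    items = list(items)
    swapped = True
    while swapped:
        swapped = False
        for i in range(len(items) - 1):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]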
Insertion sort is another simple algorithm. Starting at the beginning of the list, the algorithm "separates" the first element into what is considered the sorted list; by itself, that element is trivially sorted. It then takes the next element in the list and places it into the correct location inside the sorted list. This continues down the list of objects until a fully sorted list is produced.
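A corresponding sketch of insertion sort, growing a sorted prefix one element at a time:

def insertion_sort(items):
    # Slide each new element left until it sits in the right spot
    # inside the already-sorted prefix.
    items = list(items)
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        while j >= 0 and items[j] > current:
            items[j + 1] = items[j]   # shift larger elements right
            j -= 1
        items[j + 1] = current
    return items

print(insertion_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]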
Akin to bubble sort and insertion sort, shell sort compares pairs of elements that sit half the length of the list apart and swaps them if needed. If a pair is swapped, the same gap is used to compare the elements before it and check for further swaps. After comparing all elements at this gap, the algorithm iterates over the list again with a smaller gap and continues until the list is sorted; the final pass, with a gap of one, is exactly insertion sort.
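And a sketch of shell sort, halving the gap each round until the final pass is an ordinary insertion sort:

def shell_sort(items):
    # Insertion sort over elements a 'gap' apart, halving the gap each round.
    items = list(items)
    gap = len(items) // 2
    while gap > 0:
        for i in range(gap, len(items)):
            current = items[i]
            j = i
            while j >= gap and items[j - gap] > current:
                items[j] = items[j - gap]
                j -= gap
            items[j] = current
        gap //= 2
    return items

print(shell_sort([23, 12, 1, 8, 34, 54, 2, 3]))  # [1, 2, 3, 8, 12, 23, 34, 54]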
Selection sort takes a slightly different approach, only swapping when it knows it has the correct value for a position in the list. The algorithm iterates over the entire list to find the lowest value, then swaps that value with whatever element is in the first position. It then repeats the process with the next lowest value, placing it in the second position, and continues until the list is sorted.
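Finally, a sketch of selection sort, swapping the smallest remaining value into place on each pass:

def selection_sort(items):
    # Find the smallest remaining value and swap it into the next position,
    # so the front of the list is always fully sorted.
    items = list(items)
    for i in range(len(items)):
        smallest = min(range(i, len(items)), key=items.__getitem__)
        items[i], items[smallest] = items[smallest], items[i]
    return items

print(selection_sort([64, 25, 12, 22, 11]))  # [11, 12, 22, 25, 64]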
Monday, April 18, 2011
On a Lighter Note
A few weeks ago I wrote a blog post about a comic I'd run across in Dinosaur Comics which talked about lazy posting. Dinosaur Comics is definitely a good read but doesn't have much connection to computer science, with the exception of a wild one here or there. I thought it would be a good idea to share some other comics I've run across that deal largely with computer science and that I enjoy a lot.
Abstruse Goose is the most recent one I've started reading. It's a comic with no real overarching premise but a pretty strong connection to computer science, art, and mathematics. This comic sums up my college career pretty well.
xkcd by Randall Munroe is a well-known and well-acclaimed web comic with a very heavy focus on math, physics, and computer science, although it does get sidetracked sometimes with some rather sappy strips that can feel a bit off track. He also has a lengthy blog of his own with tons of mathematics and science to read to your heart's content.
Saturday Morning Breakfast Cereal by Zach Weiner is my favorite of the comics listed here. He updates it daily and manages to produce consistently funny comics over a wide range of subjects, from philosophy to economics, from politicians to scientists, injecting enough of his own goofy humor that it's always a great read. He even has a comic about communicating science!
Saturday, April 9, 2011
Pixel Coding
What we have here is what's called pixel coding. More art than actual coding language, pixel coding takes advantage of the common language shared by all computer files: binary. Colors can be encoded in differing numbers of bits, 12 or 24 or 32 or more, depending on what the user needs. By controlling red, green, and blue values along with opacity, it's possible to generate any 32-bit binary string needed. With an understanding of application programming interfaces, or APIs, and a knowledge of the binary strings needed to produce different function calls, it's possible to translate pixels into those calls directly. A system's API is the set of instructions that lets a user tap the functionality built into the system. Aside from some header information that depends on the file extension, changing the file from .raw to .com shouldn't change the binary strings inside it. The program encoded in the image drew a texture tunnel from nothing but pixels.
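A toy Python sketch of the underlying idea, packing four arbitrary bytes into one 32-bit RGBA pixel and unpacking them again; the byte values here are made up, and this is not the actual tunnel program:

def bytes_to_pixel(r, g, b, a):
    # Pack four 8-bit channel values into one 32-bit integer.
    return (r << 24) | (g << 16) | (b << 8) | a

def pixel_to_bytes(pixel):
    # Recover the four raw bytes from a 32-bit pixel value.
    return ((pixel >> 24) & 0xFF, (pixel >> 16) & 0xFF,
            (pixel >> 8) & 0xFF, pixel & 0xFF)

# Any four bytes of a program -- opcodes, data, whatever -- can be stored
# as one pixel's color and opacity, then read back out unchanged.
pixel = bytes_to_pixel(0xDE, 0xAD, 0xBE, 0xEF)
print(hex(pixel))             # 0xdeadbeef
print(pixel_to_bytes(pixel))  # (222, 173, 190, 239)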
Wednesday, April 6, 2011
When Random Isn't Random Enough
For anyone who has done a bit of programming, there's a good chance you've had to use random numbers. They can be very helpful for simulating the chaotic events we need for experiments or programs, such as basic AI for a game or natural events. But an important thing to know is that most random numbers generated by a computer are not random at all; they are pseudorandom.
Random numbers generated by a computer require a seed, a number fed into an algorithm to produce the "random" output. This seed can come from many different places depending on the random number generator. The most basic generator uses a static number each time, producing the exact same string of random numbers on every run, which is not very random. The next step up uses the system clock, pulling the current time as a count of seconds since a fixed epoch (midnight of January 1, 1970, on Unix systems). This is a decent way to "randomize" the seed, but numbers generated within the same second pull from the same seed, which may cause problems. Other random number generators use chaotic input from the user to determine seeds, such as mouse movements or keyboard input. Mike Ash has a good blog post about the different tiers of random number generators here.
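As a toy illustration, here is a classic linear congruential generator in Python, first with a fixed seed and then with a clock-based one; the constants are traditional ones used in several C libraries, chosen here just for the sketch:

import time

def lcg(seed, count, a=1103515245, c=12345, m=2**31):
    # A classic linear congruential generator: the whole "random"
    # sequence is completely determined by the seed.
    values = []
    state = seed
    for _ in range(count):
        state = (a * state + c) % m
        values.append(state)
    return values

print(lcg(42, 3))                  # same seed -> same three numbers, every run
print(lcg(int(time.time()), 3))    # clock-based seed -> changes once per second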
If you desire truly random numbers, look no further than Random.org. At Random.org, they use atmospheric noise to generate random numbers in a variety of forms, from simple integers to die rolls or cards drawn from a deck. They also have a very good piece detailing pseudorandom number generators and how they compare with true random number generators. You could always use it to generate your passwords or choose your next vacation spot; it's probably better than throwing darts at a map.
Wednesday, March 30, 2011
Test Your Mettle
If you've ever finished up your work for school or your job and thought that you needed more problems to solve, say hello to Project Euler. Here's a brief description from the Project Euler website:
What is Project Euler?
Project Euler is a series of challenging mathematical/computer programming problems that will require more than just mathematical insights to solve. Although mathematics will help you arrive at elegant and efficient methods, the use of a computer and programming skills will be required to solve most problems.
The motivation for starting Project Euler, and its continuation, is to provide a platform for the inquiring mind to delve into unfamiliar areas and learn new concepts in a fun and recreational context.
You're free to use any language you feel comfortable with, as each question asks for a single numeric answer that you enter manually. Each problem is designed to be solvable within a minute, given a solid implementation of the right mathematical algorithms and good use of programming tricks. After you solve a problem, whether by brute-forcing the answer or devising your own algorithm, you gain access to a message board about that problem, full of tips and tricks for making your program better that you can apply to later problems.
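As a taste of the format, Problem 1 asks for the sum of every natural number below 1000 that is a multiple of 3 or 5; a brute-force one-liner handles it, though later problems demand much smarter math:

# Project Euler problem 1: add up every natural number below 1000
# that is a multiple of 3 or 5.
answer = sum(n for n in range(1000) if n % 3 == 0 or n % 5 == 0)
print(answer)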
So go ahead, create an account and start hacking away with your favorite language. One by one, you'll solve the 330 problems and get on a roll. Hell, you might even learn a thing or two.
Wednesday, March 23, 2011
Second Self Evaluation
Self Review #2
Second time around, and this is the next self-evaluation. This period I let myself get bogged down by other classes, so my update schedule suffered quite a bit. I tried to catch up during spring break, so quite a few posts piled up at the end.
Number and topic of posts
I had 6 total content posts for this evaluation period, about 1.5 posts per week, not counting the week we had off for spring break. This isn't a strong number of posts and could definitely improve. My posts are generally fairly long, with few if any one- or two-line posts. I definitely need to sprinkle some short, to-the-point posts in with my longer ones. For now, I'm still sitting in the C range when it comes to post count.
Content of posts
The content of my posts has improved over the posts made during the first evaluation period. I don't think I rely on the deficit model nearly as much, and as a result my posts are more informative and useful for those who read them. The grammar of the posts has remained consistent and hopefully was never really a problem. I think the content is around a B level, maybe closer to a B+.
Readers on my blog
The posts at the beginning of the evaluation period netted quite a few views and comments. I don't think many people were commenting a lot during spring break so my newer posts don't have too many comments except for the most recent. I've shared my blog with a few friends to increase readership outside of the class. This falls in the C/B range of the rubric.
Comments on other blogs
This time around I spent more time going to blogs I hadn't visited yet and tried to leave a comment or two on them. I still mostly frequent Zach's Technology Complicated and Carlos' ScanMeIn and leave the majority of my comments there. I'm probably sitting around a C in the rubric for this.
Misc
In every post I try to include helpful links, pictures, and videos that may be useful to the reader. I've kept a consistent practice of making references available in hopes of giving the reader enough information. I think I've been around the B range for these miscellaneous requirements.
Overall
The process of creating a blog post hasn't gotten much easier for me; I still sit in front of my computer for a few hours to knock out a three- or four-paragraph post, but the content of my posts has improved significantly. Keeping up with, and exceeding, the number of posts I should be creating is my biggest hurdle right now and what I need to work on most. I think my blog is around a low B on the rubric.
Tuesday, March 22, 2011
If We Just Technobabble the Technobabble...
Suspension of disbelief is an important and powerful thing to have if you want to enjoy certain movies or TV shows. Crazy physics or wonky engineering don't bother me terribly much. Technobabble, as popularized by Star Trek and seen almost everywhere nowadays, can be an effective means of solving a crazy situation with equally crazy actions, with logic so impenetrable that it sounds legitimate. However, when I notice these things in my own field of study and interest, my suspension of disbelief often goes right out the window.
This isn't the 1980s anymore; the computer is not some black-magic box that is both unknowable in its workings and yet so simple that a few key presses can do seemingly anything. Why do movies and television shows continue to treat it like such an abstract concept? It seems like they don't want to alienate viewers with limited computer knowledge, so they remain safely within the confines set by old technology, with text-only displays, nonexistent mouse use, and people slamming on the keyboard while stuff gets done. This is what I see and feel when this happens.
Near the end of Independence Day, Jeff Goldblum's character uploads a virus from his Mac to the alien ship, crippling its defenses. Overlooking the fact that they used the printer serial port to somehow wirelessly upload the virus to the ship, the headache of fixing every compatibility issue between the two systems (language, platform, endianness, would it even use bits?, etc.) would be astronomical, yet the virus is written and implemented quickly, fired up, and sent (complete with an upload progress bar) just in time to save the day.
Uploading a computer virus to the alien mothership in Independence Day
I guess calling out CSI for taking liberties with technology is a bit old hat, but the sheer ham-fistedness of this clip astounds me. The same thing happens with cybercrime committed on TV or in the movies, be it War Games, Hackers, or Firewall: throw together a few technical-sounding terms until it sounds menacing or helpful, and you have your story or your solution. Also, I wish we had the technology to turn a blurry six-pixel image of a face into a high-definition portrait.
I'm guessing the reason bad computer technology bothers me so much is that I'm so close to the subject. If I were an engineer or a physicist, other parts of television would bother me more than goofy computer lingo strung together.
Sunday, March 20, 2011
Life Electronic: Straight to the Moon
Sometimes it's easy to forget how integrated computers have become in our society over the past few decades. Computers are integral parts of our lives even when we're not aware of them. It's good, at times, to step back and appreciate some of the things we did before computers were an indivisible part of our lives. Of course much astounding work was done before computers ever really came into the limelight in the 1940s and '50s, but most of it has been repeated and replicated ad nauseam. There is, however, one feat first achieved on July 20, 1969 that hasn't been replicated since December 11, 1972. I am, of course, speaking about the moon landings. One of the most amazing things I find about the moon landings is that we were able to accomplish this:
with this:
The astronauts of the Apollo 11 mission carried pocket slide rules with them into space.
For those unfamiliar with slide rules (I'd wager most of the class has little to no experience with them; I know I've never played with one myself), there are a few things you need to know that make this accomplishment all the more astounding. Most slide rules are precise to two significant digits, with enough room for the user to approximate a third. Slide rules did not keep track of magnitude; that was left to the user, so calculating 4*5 would yield the same reading as 4000*0.5. To those of us used to calculators (pocket calculators weren't produced until the early 1970s) and personal computers, these devices seem ancient, almost primitive, but for the daring astronauts and engineers of the day, they were enough for many of the calculations they faced, from launch to landing.
Don't think that computers were completely absent from sending men to the moon. Larger batch calculations were encoded on punch cards and run through IBM System/360 computers. The Apollo Guidance Computer was used on every Apollo flight to the moon for guidance, navigation, and control; two AGCs flew on each mission, one in the command module and one in the lunar module. The AGC was one of the first computers to use integrated circuits, with a grand total of about 2,800 circuits making up its logic. Even a simple cellphone today is more powerful than those 70-pound devices and could probably handle, by itself, most of the calculations done by every computer in 1969.
This achievement is truly a testament to the people of that era. With today's standards of technology and risk management, I'm sure most people would think we were sending those men to their deaths (an outcome, I found out, that was even prepared for, in case the worst happened and the men were stranded on the moon). Remarkable times for remarkable men.
Thursday, March 17, 2011
Man vs Machine: Watson Aftermath
If you were one of the twelve million viewers last month watching IBM's supercomputer Watson compete on Jeopardy! against champions Brad Rutter and Ken Jennings, you saw a rather dominating performance from Watson. Watson finished the first game at $35,734 and completed the three-day, two-game championship with a total of $77,147, with Ken Jennings finishing second at $24,000 and Brad Rutter third with $21,600. IBM claimed the million-dollar prize for the championship, donating the money to the charities World Vision and World Community Grid. Jennings took home $300,000, donating half to VillageReach, and Rutter claimed the $200,000 prize, donating half to the Lancaster County Community Foundation.
Ken Jennings finishing off Final Jeopardy with a bit of humor
Watson led a commanding performance through most of the games. Watson would only buzz in when its confidence level was above a certain threshold, but its buzz was lightning fast. The only times Jennings or Rutter were able to chime in were when Watson had low confidence in its answer or when they anticipated the buzz and beat Watson to the punch. Watson also played very logically, if a bit bizarrely by human standards. It would hop around the board searching for the Daily Doubles, and when it found them it would bet very precise amounts based on the current state of the game, wagering $6,435 on an early Daily Double and $17,973 on Final Jeopardy on the third day.
Not all of Watson's performance was perfect. Watson couldn't "hear" what the other contestants buzzed in with; if Ken or Brad rang in first and answered incorrectly, Watson would sometimes buzz in and give the same answer. At the end of the first game, I'm sure much to the chagrin of its creators, Watson's answer of "What is Toronto?????" to the Final Jeopardy category "U.S. Cities" was a bit off the mark. The five question marks signaled a very uncertain answer from Watson. The IBM team said they didn't give much weight to the category title when choosing answers because Jeopardy! writers often use jokes or puns in the titles. Unfortunately for Watson's human competitors, it wagered only $947 of its $36,681 first-day total, giving the challengers no respite. For those interested, here is the answer that stumped Watson:
"This U.S. city's largest airport is named for a famous World War II hero, its second largest for a famous World War II battle."Overall, I was incredibly impressed by IBM's showing on Jeopardy! Watson showed a depth and understand of the human language that could have a great number of implications. Being able to parse a question and delve through terabytes of information and return an answer in a few seconds is a critical step towards an era of intelligent, helpful computers. IBM wanted to show off what they could do, and show they did.
Wednesday, March 2, 2011
Class Discussion: T-Rex has Something to Say
During class we've been talking about various methods to engage users and encourage dialogue between the writer and the reader. From controversial topics to personal interest in a story, each method has its merits and flaws. Some are more situational than others while other techniques are more universal and can be used in a variety of settings. However, T-Rex from Dinosaur Comics has something to say about one of the more common techniques found throughout the medium, asking questions.
T-Rex makes a valid point (even if his point wanders). What's the point of asking "What do you think?" about a topic? Most everyone thinks something about everything; prompting them with a lame question doesn't really help at all. The questions posed by an author should try to be as thought-provoking as the article they are about.
Now, time to counter my own post by asking a few questions. What techniques have you found useful for creating discussion about your topics? How far beyond the article is too far to go in trying to engage the reader? And most important of all, if you haven't done so already, why aren't you reading the entire archive of Dinosaur Comics?
Wednesday, February 23, 2011
Scientific Programming and Non Computer Scientists
Scientists are increasingly required to write and test their own programs for their areas of research. Nature posted an article about recent trends and common pitfalls that scientists run into when they write their own code. The article found that most scientists have no formal education in programming and that most of what they know is self-taught.
Statistics of polled scientists
Various problems can arise from a lack of formal training. Program-breaking bugs may slow a scientist down more than they would a programmer. Even if the program works, the code may contain many small bugs that alter the results in imperceptible ways; one research team at the Scripps Research Institute in California had to retract five published papers because a flipped minus sign in their program altered its output.
Many scientists verify their code using validation testing, where the scientist feeds in inputs with known outputs and compares those to what the program produces. This method often misses small mistakes that only make themselves apparent on other data. Many programmers use more rigorous testing methods, breaking code into small chunks, testing each chunk individually, and then testing how the chunks fit back together.
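As a toy illustration of the difference (not taken from the Nature article), here is how a couple of small pieces might each get their own test with Python's built-in unittest module; the functions are made up for the example:

import unittest

def celsius_to_kelvin(c):
    return c + 273.15          # a flipped sign here would be caught below

def average(values):
    return sum(values) / len(values)

class SmallPieceTests(unittest.TestCase):
    # Each small chunk gets its own check, instead of one end-to-end run.
    def test_celsius_to_kelvin(self):
        self.assertAlmostEqual(celsius_to_kelvin(0), 273.15)
        self.assertAlmostEqual(celsius_to_kelvin(-273.15), 0)

    def test_average(self):
        self.assertEqual(average([2, 4, 6]), 4)

if __name__ == "__main__":
    unittest.main()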
Verbose commenting is another practice that many self-taught scientists may ignore. A lack of comments makes code harder to understand or change, and it makes using someone else's code, or having someone else use yours, confusing and nearly impractical. Encouraging scientists to form groups to share and talk about their programs can foster better communication between scientists and better code.
Seeing as the class is filled with majors other than computer science, I have a few questions about your experiences with coding. How often are you asked to program in your other classes or at your job? What kind of training have you received, such as classes or on-the-job training, or did you teach yourself? Would you like more classes about proper coding techniques, or are they more of an annoyance than a help?
Wednesday, February 16, 2011
Algorithms in Action: Genetic Algorithm
Artificial intelligence uses a great number of algorithms to perform the various functions an AI requires. An algorithm is a structured set of logical instructions for reaching a solution to a problem. One such algorithm is the genetic algorithm, a search heuristic that mimics heredity, exploring different solutions based on the fitness, or correctness, of each element in a generation. Heuristics are experience-based approaches that use previous knowledge for problem solving or learning, such as a rule of thumb or trial and error. BoxCar2D gives a very good visual of how the genetic process works.
Genetic algorithms follow a flow not unlike biological genetics to simulate the evolution of a system. An initial population is created, generally at random, and each element of that population is put through a fitness test. After these tests are run, the percentage of the population deemed most fit is selected and allowed to "breed." This select portion of the population mixes pieces of its code together to generate a new population of children that will hopefully improve the overall fitness of the next generation. Along with mixing the genetic code, the algorithm can introduce mutations in a percentage of the next generation, randomizing different segments of the code to introduce new traits and prevent an evolutionary dead end.
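Here is a bare-bones Python sketch of that loop, evolving random bit strings toward an arbitrary target pattern that stands in for a real fitness test; this is not BoxCar2D's code, just the general shape of a genetic algorithm:

import random

TARGET = [1] * 20                      # the "perfect" genome for this toy fitness test

def fitness(genome):
    # Count how many genes match the target.
    return sum(g == t for g, t in zip(genome, TARGET))

def breed(parent_a, parent_b, mutation_rate=0.05):
    # Mix two parents gene by gene, with occasional random mutation.
    child = [random.choice(pair) for pair in zip(parent_a, parent_b)]
    return [1 - g if random.random() < mutation_rate else g for g in child]

population = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
for generation in range(40):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                       # the fittest get to breed
    population = [breed(random.choice(parents), random.choice(parents))
                  for _ in range(30)]
print("best fitness after 40 generations:", max(fitness(g) for g in population))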
BoxCar2D in action
In the BoxCar2D simulation, the user can specify a few traits of the car and watch it evolve. The fitness test is based on how far the car can go and how fast. The best few cars are selected and pass their traits on to the next generation, and the pattern continues. The user can affect the selected traits by voting cars up or down and by changing the mutation rate, which can alter body shape, wheel size, or wheel placement. The graph that builds up gives the user a look at how each generation does on average against the fitness threshold. Take it for a drive and see how it works.
Tuesday, February 8, 2011
TED Talks: Birth of the Computer
The history of the computer goes well beyond the creation of the basic electromechanical and electromagnetic computers developed in the early-to-mid 20th century. The science behind it was developed over the centuries preceding that, from relays to binary arithmetic. Historian George Dyson gave a TED talk about the long history of the computer, along with a humorous look through some notebooks kept by early programmers.
Check out the video here.
Monday, February 7, 2011
Security: A New Frontier
In July of 2010, a new virus was discovered in Iran, unlike any the security world had ever seen. This virus, known as Stuxnet, was found on computers located within the Iranian nuclear infrastructure. What amazed computer security experts was the sophistication and purpose of the virus: it was the first virus known to target and interfere with industrial infrastructure, specifically Iran's nuclear facilities. Symantec released a dossier detailing Stuxnet's methods after spending six months reverse engineering it.
The initial infection of the networks inside the facility was most likely caused by an infected thumb drive, as the internal network is isolated from the internet to maintain compartmentalization and prevent remote attacks. From there, using a set of different vulnerabilities found on the computers in the network, the virus spread while searching for very specific machines: computers running a program called Step 7. This software is used to program programmable logic controllers, or PLCs, which sit between a computer and machinery. Stuxnet would remain hidden on a computer until it was connected through Step 7 to a variable-frequency drive, a device that controls the rotational speed of the centrifuges that enrich uranium. These drives need to run at very specific frequencies; Stuxnet would slow down and speed up the rotations to damage or destroy the centrifuges and prevent the uranium from being enriched.
The Stuxnet virus is relatively harmless on ordinary computers, only seeking out new machines to infect when the current one holds no relevance to its main purpose. The real threat is that Stuxnet becomes a blueprint for new generations of malware. Only time will tell whether this is an anomaly in the security world or the beginning of a new era of cyber security.
Thursday, February 3, 2011
Man vs. Machine: Computer to play on Jeopardy!
Starting on February 14th, Ken Jennings, Brad Rutter, and Watson will play a three-day tournament with a million dollars on the line. Jennings and Rutter are the two top players in Jeopardy! history, Jennings having the longest winning streak on the show and Rutter being its biggest money winner after taking the Ultimate Tournament of Champions. But who is Watson? Not Sherlock Holmes' Dr. Watson; no, this Watson is a supercomputer created by IBM.
Watson is an advanced artificial intelligence designed to answer questions in a very human, very natural context. Using a complex system of algorithms to process natural language and to store, search, and retrieve terabytes of information, Watson can determine the answer to almost any question in a few short seconds. Outfitted with a buzzer, Watson is set to challenge Jennings and Rutter to an intense mind-versus-machine match on Jeopardy!
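Watson's real pipeline is far more sophisticated than anything that fits in a blog post, but a toy sketch can show the basic retrieve-and-score idea behind question answering: search a store of text for the entry that best overlaps the clue. The documents, clue, and scoring rule below are invented purely for illustration and have nothing to do with IBM's actual system.

# Toy question answering by keyword overlap -- a stand-in for the retrieval step
# of a real QA system, nothing like Watson's actual architecture.
documents = {
    "Toronto": "Toronto is the largest city in Canada and the capital of Ontario.",
    "Chicago": "Chicago sits on Lake Michigan and has two major airports, O'Hare and Midway.",
}

def answer(clue):
    clue_words = set(clue.lower().split())
    def overlap(item):
        _, text = item
        # Score each candidate by how many words it shares with the clue.
        return len(clue_words & set(text.lower().split()))
    best_name, _ = max(documents.items(), key=overlap)
    return best_name

print(answer("the largest city in Canada"))   # -> Toronto

Where this toy picks whichever entry shares the most words with the clue, Watson weighs hundreds of evidence scores against each other and only buzzes in when its confidence is high enough, which is what makes it a fair opponent rather than a lookup table.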
Ken, Watson, and Brad during a practice match on January 13th, 2011.
Self-Interview And Self Review
Today, I'll answer some general questions about ENIAC Beyond and its enigmatic creator.
What is the purpose of this blog?
This blog was created for Communicating Science with the intent to create an effective channel of information about computer technologies.
Who is the imagined audience(s) of this blog?
I would like this blog to reach those interested in computer technology. This interest could range from the history of computers through the generations, future development of computer technologies, a specific field in computer science such as artificial intelligence or robotics, or just a simple desire to learn more about the machines that have become so integrated in our daily lives.
Have my posts matched up with my purpose/audience? What/who might I be overlooking in defining my purpose/audience this way?
So far, my posts have focused on the history of computers and have not gone into great detail on any one field. This does little for those who already know quite a bit about computers and wish to know more about coming developments and technologies.
What can I do to encourage more reader participation with my blog?
I could ask the reader more questions: what do they think about this technology, or what would they like to see come out of this research?
How can I expand my audience in this class? Outside of this class?
Networking inside the class will happen gradually as we work with each other. Since everyone in the class has created a blog, associating a face with each blog will encourage us to read more and more of them.
Outside of the class, connecting with other similar blogs would be a good idea. Creating a network could help get my blog out there to a whole world of readers.
How would I characterize the tone of my blog?
The overall tone of the blog so far has been educational with a few lighthearted comments thrown in.
What do I hope to get out of writing this blog?
I hope that constant use of the blog will enable me to communicate ideas more easily. Moving ideas from my head to the page has always been more of a struggle than it should be; hopefully this will ease the process and let my ideas flow.
What would I like others to get out of it?
A greater appreciation and understanding of the computers around them.
What are the strengths of my blog/my blogging?
Computer science is a field I have been invested in for a long time. Many of its subfields run very deep, and I hope I can convey that sense to readers.
What are the weaknesses?
I have difficulty placing the ideas that float around in my mind onto the page. Oftentimes I'll worry whether what I'm writing is interesting to those reading it or whether it comes off as the inane ramblings of someone trying to fill pages.
Have I used a deficit model in my writing, or something else? How would I know?
Possibly; my first few posts did little to encourage discussion and create a dialogue.
How have I characterized (implicitly or explicitly) science, engineering, and/or technology in my blog?
As a rapidly evolving frontier that has a lot of information and history behind it.
How have I characterized myself?
So far, I haven't placed much personal tone on the posts, so currently it is a bit standoffish, a lecturer perhaps.
Self Review
After updating my blog, I would often go to some of the other blogs to see how they were doing, catching up on posts and making comments. There, I'd see the level of activity going on with each blog. Some look about the same as mine, while others are so prolific with their posts and comments that it makes me a little embarrassed at the progress of my own. But the point of this isn't to grade myself against other blogs; it is to see how well the blog is doing against the rubric, so let's take a look at some of the requirements.
Number and topic of posts
So far, I've managed to keep up with the minimum number of posts, but not by much. I've generated five posts with actual content for the blog, not really a staggering amount. The posts themselves have followed the requirements of what should be posted each week, such as framing a post or relating it to what we read in the book. This fits into the C range of the rubric. Hopefully I'll be able to write more than the minimum as the class goes on.
Content of posts
My first content post was very long and full of information; I thought that was what every post should look like. I quickly moved away from that paradigm, as it took me much too long to write, so the following posts were much shorter and more intent on getting readers to share ideas or opinions about the topic. The posts are written coherently and do not talk down to the reader; they simply instruct. Grammar is not one of my stronger points, but I try to keep the errors to a minimum. They pass the C requirement and possibly edge close to the B level.
Readers on my blog
I've had a few comments on my posts so far, each of which I've tried to answer quickly and clearly. The Watson post was fairly popular because it had to do with something people could actually witness instead of just being told about. Hopefully more comments will pop up as everyone gets used to the format of the class. Meeting the C requirement and maybe close to the B.
Comments on other blogs
Following other blogs is easy once you can put a face with the name. I follow and post comments on Zach's Technology Complicated and Carlos' ScanMeIn because I sat and chatted with them the first few weeks of class. As the semester rolls along, hopefully I'll work with more people in the class, follow more blogs, and post more comments. Around a C in total comments on other threads.
Misc
I try to link as much as possible to the articles and videos I reference in my blog, along with Wikipedia pages for concepts or doodads that people may want a deeper background on. I try to use pictures where appropriate, but I definitely could use more in my posts. Keeping up with the C requirements.
Overall
It's been a rather slow start for the blog, and keeping up with the requirements of the class is proving harder than I thought. I'm really hoping that as the semester goes on the blogging process becomes easier and I spend more time posting than worrying whether something is interesting to the audience. I think the blog has met the requirements for a C, but that really isn't good enough. I need to kick it into a higher gear for the next evaluation and grow as a blogger.
Wednesday, January 26, 2011
Happy 90th, Robots
This year marks 90 years since the creation of the word robot. Engadget has a nice, short article about the etymology of the word.
Amazing Pace: Processing Power
"I went to see Professor Douglas Hartree, who had built the first differential analyzers in England and had more experience in using these very specialized computers than anyone else. He told me that, in his opinion, all the calculations that would ever be needed in this country could be done on the three digital computers which were then being built — one in Cambridge, one in Teddington, and one in Manchester. No one else, he said, would ever need machines of their own, or would be able to afford to buy them." -- Professor Douglas Hartree, Cambridge mathematician, 1951In 1943, design and construction of the Electronic Numerical Integrator And Computer, or ENIAC for short, began at the United States Army's Ballistic Research Laboratory. When the ENIAC was completed in 1946 it weighed over 27 tons and took up 680 square feet. The logic components of the computer consisted of 17468 vacuum tubes, 72000 diodes, 1500 relays, 70000 resistors, 10000 capacitors, and 5 million hand soldered joints. Input into the computer was done with switches and cards, some of the more complex programs requiring over a million cards. Each card would have to be input by hand into the machine, a process that could take weeks depending on the complexity of the program. The ENIAC was not the first computer built but it was one of the first to utilize a programmable, electronic architecture.
ENIAC
"Where a calculator on the ENIAC is equipped with 18 000 vacuum tubes and weighs 30 tons, computers of the future may have only 1 000 vacuum tubes and perhaps weigh 1½ tons." -- Popular Mechanics, March 1949.The next huge leap in computer technology came in the late 50s. Bell Labs began work on using transistors to replace vacuum tubes as the main logic controller for the computer. Transistors are semiconductors which use electrical impulses to amplify and redirect the current. Compared to vacuum tubes, these transistors were incredibly lightweight, small, and inexpensive to produce. Each generation of transistor would see the size decrease. Eventually, the transistors were able to get small enough that instead of being stored in racks next to the computer, they were able to be placed on circuits and integrated into the hardware of the computer. The transistor count on these chips was directly related to the number of operations it could perform at any given time.
Eventually, thousands of transistors could be placed on a single circuit, leading to the development of the microprocessor in 1971. From there, the number of transistors on these microprocessors exploded. Following a trend first noticed by Intel co-founder Gordon Moore, the number of transistors that can be inexpensively placed on a chip doubles roughly every two years, a trend known as Moore's Law. This prediction has held remarkably well for the past few decades, with transistor counts on today's microprocessors numbering in the billions.
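As a rough sanity check on that doubling (my own back-of-the-envelope figures, not from the sources above): Intel's first microprocessor, the 4004 of 1971, had about 2,300 transistors, and doubling that every two years lands in the billions by 2011.

# Back-of-the-envelope Moore's Law projection. The only inputs are the ~2,300
# transistors of the 1971 Intel 4004 and a doubling period of two years.
start_year, start_count = 1971, 2300
for year in range(1971, 2012, 10):
    doublings = (year - start_year) / 2
    print(year, int(start_count * 2 ** doublings))
# Prints roughly 2,300 for 1971, about 2.4 million for 1991,
# and about 2.4 billion for 2011 -- the right ballpark for a modern chip.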
Moore's Law
Welcome to ENIAC Beyond
Whether we like it or not, computer technology is deeply integrated into our society. Over the course of a few decades, computers have become a critical component of our everyday lives, from running basic necessities such as water and power infrastructure to enabling quick communication over great distances.
This blog will focus on this ever growing part of our world: past, present, and future. A lot has been done over the past few decades, and much more is on the horizon waiting to be applied to the lives we live. The information here will cover a wide range of subjects, from media to programming to robotics.
I can't wait to see what's in store.