Multicore Processors in 2007
Introducing Multi-Core Technology
Multi-core processors represent a major evolution in computing technology.
Placing two or more powerful computing cores on a single processor opens up a world of important new possibilities. Learn about the evolution of multi-core technology and its advantages.
The Evolution of Multi-Core Technology
Ever since the introduction of the first computers, the market has demanded more computing capacity. Symmetric multi-processing (SMP) has long been used to increase computing performance and efficiency by spreading computing loads across multiple processors. SMP is especially effective in multi-threaded environments where many tasks (threads) need to be handled simultaneously.
As application performance demands grow, processor designers face the issue that it takes more power to drive more computing capability. More power means more heat, so dissipation also has to be managed. Add to this the industry's demand for computers to become smaller – more servers in a rack, thinner and lighter laptops, and a smaller footprint for desktop systems. Multi-core processing helps address these challenges: it allows increased performance and higher productivity in smaller computers that can simultaneously run multiple complex applications and complete more tasks in a shorter amount of time.
The Multi-Core Advantage
In today's digital world the demands of complex 3D simulations, streaming media files, added levels of security, more sophisticated user interfaces, larger databases, and more on-line users are beginning to exceed single-core processor capabilities.
Multi-core processors enable true multitasking. On single-core systems, multitasking can max out CPU utilization, with performance degrading as operations wait their turn to be processed. On multi-core systems, each core can execute a separate thread, so the operating system has the resources to run compute-intensive tasks genuinely in parallel.
Multi-core technology can improve system efficiency and application performance for computers running multiple applications at the same time.
Imperative Programming
In computer science, imperative programming, as contrasted with declarative programming, is a programming paradigm that describes computation as statements that change a program state. In much the same way as the imperative mood in natural languages expresses commands to take action, imperative programs are a sequence of commands for the computer to perform.
Imperative programming languages stand in contrast to other types of languages, such as functional and logical programming languages. Functional programming languages, such as Haskell, are not a sequence of statements and have no global state as imperative languages do. Logical programming languages, like Prolog, are often thought of as defining "what" is to be computed, rather than "how" the computation is to take place, as an imperative programming language does.
Overview
The hardware implementation of almost all computers is imperative; nearly all computer hardware is designed to execute machine code, which is native to the computer, written in the imperative style. From this low-level perspective, the program state is defined by the contents of memory, and the statements are instructions in the native machine language of the computer. Higher-level imperative languages use variables and more complex statements, but still follow the same paradigm. Recipes and process checklists, while not computer programs, are also familiar concepts that are similar in style to imperative programming; each step is an instruction, and the physical world holds the state. Since the basic ideas of imperative programming are both conceptually familiar and directly embodied in the hardware, most computer languages are in the imperative style.
Assignment statements, in general, perform an operation on information located in memory and store the results in memory for later use. High-level imperative languages, in addition, permit the evaluation of complex expressions, which may consist of a combination of arithmetic operations and function evaluations, and the assignment of the resulting value to memory. Looping statements allow a sequence of statements to be executed multiple times. Loops can either execute the statements they contain a predefined number of times, or they can execute them repeatedly until some condition changes. Conditional branching statements allow a block of statements to be executed only if some condition is met. Otherwise, the statements are skipped and the execution sequence continues from the statement following the block. Unconditional branching statements allow the execution sequence to be transferred to some other part of the program. These include the jump, called "goto" in many languages, and the subprogram, or procedure, call.
History
The earliest imperative languages were the machine languages of the original computers. In these languages, instructions were very simple, which made hardware implementation easier, but hindered the creation of complex programs. FORTRAN, developed by John Backus at IBM starting in 1954, was the first major programming language to remove the obstacles presented by machine code in the creation of complex programs. FORTRAN was a compiled language that allowed named variables, complex expressions, subprograms, and many other features now common in imperative languages. The next two decades saw the development of a number of other major high-level imperative programming languages. In the late 1950s and 1960s, ALGOL was developed in order to allow mathematical algorithms to be more easily expressed, and even served as the operating system's target language for some computers. COBOL (1960) and BASIC (1964) were both attempts to make programming syntax look more like English. In the 1970s, Pascal was developed by Niklaus Wirth, and C was created by Dennis Ritchie while he was working at Bell Laboratories. Wirth went on to design Modula-2, Modula-3, and Oberon. For the needs of the United States Department of Defense, Jean Ichbiah and a team at Honeywell began designing Ada in 1974, a language which focuses on secure programming aspects, but did not complete the specification until 1983.
The 1980s saw a rapid growth in interest in object-oriented programming. These languages were imperative in style but added features to support objects. The last two decades of the 20th century saw the development of a considerable number of such languages. Smalltalk-80, originally conceived by Alan Kay in 1969, was released in 1980 by the Xerox Palo Alto Research Center. Drawing on concepts from Simula (developed in the late 1960s and considered the world's first object-oriented programming language), Bjarne Stroustrup designed C++, an object-oriented language based on C; C++ was first implemented in 1985. In the late 1980s and 1990s, the notable imperative languages drawing on object-oriented concepts were Perl, released by Larry Wall in 1987; Python, released by Guido van Rossum in 1991; PHP, released by Rasmus Lerdorf in 1995; and Java, first released by Sun Microsystems in 1995.
Example languages
The canonical examples of imperative programming languages are Fortran and Algol. Others include Pascal, C, and Ada.
A more exhaustive list can be found in Wikipedia's Category:Imperative programming languages.
System Programming
Systems programming (or system programming) is the activity of programming system software. The primary distinguishing characteristic of systems programming, compared to application programming, is that application programming produces software that provides services to the user (e.g. a word processor), whereas systems programming produces software that provides services to the computer hardware (e.g. a disk defragmenter) and therefore requires a greater degree of hardware awareness. More specifically:
the programmer will make assumptions about the hardware and other properties of the system that the program runs on, and will often exploit those properties (for example by using an algorithm that is known to be efficient when used with specific hardware)
usually a low-level programming language or programming language dialect is used that:
can operate in resource-constrained environments
is very efficient and has little runtime overhead
has a small runtime library
Hacking
What Is a Hacker?

The Jargon File contains a bunch of definitions of the term `hacker', most having to do with technical adeptness and a delight in solving problems and overcoming limits. If you want to know how to become a hacker, though, only two are really relevant.

There is a community, a shared culture, of expert programmers and networking wizards that traces its history back through decades to the first time-sharing minicomputers and the earliest ARPAnet experiments. The members of this culture originated the term `hacker'. Hackers built the Internet. Hackers made the Unix operating system what it is today. Hackers run Usenet. Hackers make the World Wide Web work. If you are part of this culture, if you have contributed to it and other people in it know who you are and call you a hacker, you're a hacker.

The hacker mind-set is not confined to this software-hacker culture. There are people who apply the hacker attitude to other things, like electronics or music -- actually, you can find it at the highest levels of any science or art. Software hackers recognize these kindred spirits elsewhere and may call them "hackers" too -- and some claim that the hacker nature is really independent of the particular medium the hacker works in. But in the rest of this document we will focus on the skills and attitudes of software hackers, and the traditions of the shared culture that originated the term `hacker'.

There is another group of people who loudly call themselves hackers, but aren't. These are people (mainly adolescent males) who get a kick out of breaking into computers and phreaking the phone system. Real hackers call these people `crackers' and want nothing to do with them. Real hackers mostly think crackers are lazy, irresponsible, and not very bright, and object that being able to break security doesn't make you a hacker any more than being able to hotwire cars makes you an automotive engineer.

Unfortunately, many journalists and writers have been fooled into using the word `hacker' to describe crackers; this irritates real hackers no end. The basic difference is this: hackers build things, crackers break them.

If you want to be a hacker, keep reading. If you want to be a cracker, go read the alt.2600 newsgroup and get ready to do five to ten in the slammer after finding out you aren't as smart as you think you are. And that's all I'm going to say about crackers.

The Hacker Attitude

Hackers solve problems and build things, and they believe in freedom and voluntary mutual help. To be accepted as a hacker, you have to behave as though you have this kind of attitude yourself. And to behave as though you have the attitude, you have to really believe the attitude.

But if you think of cultivating hacker attitudes as just a way to gain acceptance in the culture, you'll miss the point. Becoming the kind of person who believes these things is important for you -- for helping you learn and keeping you motivated. As with all creative arts, the most effective way to become a master is to imitate the mind-set of masters -- not just intellectually but emotionally as well. Or, as the following modern Zen poem has it:
Bluetooth Networking
Bluetooth™ wireless technology is taking handheld computing devices to the next level – wireless networking anytime, anyplace. With Bluetooth-enabled handhelds, users will be able to easily connect to the Internet and gain access to their email (whether it is POP3 or behind-the-firewall corporate email), browse the web, and stay connected with instant messaging. Users will also be able to connect to their corporate LANs, gaining access to the intranet, corporate databases, database synchronization, and desktop synchronization for PIM data and documents.

Bluetooth wireless technology will power new usage models for the handheld, giving users unprecedented flexibility and connectivity options for truly anytime, anyplace networking, such as:

• Handheld-to-handheld collaboration for personal area networking (WPAN)
• Handheld-to-LAN access point connectivity for local area networking (WLAN)
• Handheld-to-cellular phone communications for wide area networking (WWAN)

Only Bluetooth wireless technology on the handheld can provide all of these networking options, thanks to the inherent advantages of Bluetooth. First and foremost, because of the frequency-hopping nature of Bluetooth networking, no frequency planning is required, allowing a user to network anytime, anyplace. Next comes form factor: anytime, anyplace networking only works when the handheld is easy to carry around, and Bluetooth chip sets are small enough to fit on SD and CF cards, and are already embedded in cell phones and some handhelds.
Linux vs Windows
We are often asked which hosting platform one should choose, Linux or Windows®2003. Below, we've listed the major differences between the two that you should consider in making your final decision. In the process, we have also tried to dispel some common misconceptions regarding these platforms.
Important: One of the biggest misconceptions is that you have to run Linux on your own computer in order to host your site on the Linux platform. That is not true. The operating system you run on your own computer is irrelevant to this decision: even if you are running Windows 98 or Windows®2003 at home, you can still choose Linux hosting.
Ease of Use
Over the years, many resources have been poured into both the Linux and Windows®2003 platforms to make hosting as easy as possible. Today, there are no significant differences in ease of use between these platforms. If you are an advanced user who prefers to work in a shell environment, then Linux is your choice. However, if you are a beginner, you will find both platforms user friendly, especially with the help of Hostway's control panel, SiteControl.

Reliability
The industry consensus is that Linux is more reliable. However, Hostway has made sure that the reliability of the Windows®2003 platform closely matches that of Linux. What this means is that you can make your decision based on differences in features rather than on reliability.

Speed
There is no difference in speed between the two platforms.

Functionality
This is where the two platforms differ the most. There isn't much that can be achieved on one platform that can't be achieved on the other; the main difference is how the end is achieved. For example, if you need a database-driven Web site, you can choose either a PHP/MySQL combination under Linux or an ASP/MS SQL combination under Windows®2003. The trend is for solutions to be supported on both platforms. For example, Microsoft FrontPage, one of the most popular Web site editors, is supported on both Linux and Windows®2003. Please see below for the list of supported features.

Price
Here Linux holds a slight edge, because most of the software licenses are free and it is easier to maintain Linux servers than Windows®2003 servers for the same level of reliability and performance. You will see that our hosting prices reflect this. Unless you specifically plan to use Windows®2003-specific features (see below), it probably makes sense to choose the Linux platform.