Tuesday, October 25, 2011

Information technology

            IT is the area of managing technology and spans a wide variety of areas that include, but are not limited to, processes, computer software, information systems, computer hardware, programming languages, and data constructs. In short, anything that renders data, information, or perceived knowledge in any visual format whatsoever, via any multimedia distribution mechanism, is considered part of the IT domain. IT provides businesses with four sets of core services to help execute the business strategy: business process automation, providing information, connecting with customers, and productivity tools.

           IT professionals perform a variety of functions (IT disciplines/competencies) that range from installing applications to designing complex computer networks and information databases. A few of the duties that IT professionals perform may include data management, networking, engineering computer hardware, database and software design, as well as the management and administration of entire systems. Information technology is starting to spread further than the conventional personal computer and network technologies, and more into integration with other technologies such as cell phones, televisions, automobiles, and more, which is increasing the demand for such jobs.
             In the recent past, the Accreditation Board for Engineering and Technology and the Association for Computing Machinery have collaborated to form accreditation and curriculum standards for degrees in Information Technology as a distinct field of study, as compared with Computer Science and Information Systems. SIGITE (Special Interest Group for IT Education) is the ACM working group for defining these standards. Worldwide IT services revenue totaled $763 billion in 2009.

New Computer Inventions


              New inventions are everyday occurrences in the computer world. As a matter of fact, a month ago Stealth Ideas Inc. introduced the StealthSurfer II ID Protect. It is a miniature flash drive that lets you surf the Internet anonymously from your computer using an encrypted mode. It comes with memory ranging from 256 megabytes to two gigabytes, and prices start at $99. For those who dabble in the arts, the DigiMemo-692 Digital Notepad enables you to record sketches with ink and paper and then synchronize your notes to your PC using any handwriting recognition software.
            Handheld devices such as PDAs or Pocket PCs are best for note sketching, and the notes can be synchronized to your PC as a digital copy. The problem with handhelds is their screen input limitations, and the screen protector needs to be replaced frequently. The DigiMemo 692 Digital Notepad is a breakthrough against all these limitations, able to 'record' your sketches with ink and ordinary paper. You can easily synchronize your notes to your PC, and it is compatible with most handwriting recognition software. This handy device will record your notes, ideas, and sketches without the need to scan, and it comes with an ink cartridge, a digital pen, and software. It also comes with a USB cable and is compatible with Windows 2000 and XP.
             The Microsoft Xbox 360 is one of the latest inventions in video gaming. Video game lovers everywhere are rejoicing at the arrival of the new system. The newest Xbox is still a gamer's dream, but it is also being marketed as a media center. Not only can the system be used for playing video games, but it can also be used to play DVDs, CDs, and MP3s. Digital cameras can be hooked up to it, as well as MP3 players and even personal computers.
             Patenting computer and related inventions comes under the intellectual property rights of different countries, where the ownership and copyright are reserved to the person or organization who invented the product. But in the United States, computer hardware or software inventions are deemed patentable only if the invention is vital to a particular task or process. In short, patents are not granted to software or even hardware that is merely an extension of existing computer technology, or to mere methods of doing business.
           One of the new computer inventions, patented by IBM, is a tiny hard drive the size of a fifty-cent piece. This small, chip-like device can store up to 340 MB of data and will be very useful in mobile devices, digital cameras, music players, and so on.
           There are a lot of companies, like IBM and Microsoft, that have full-fledged research teams working full time on computer inventions.


History of Computers

            Until recently, it was common practice to divide the history of computers by period or
generations, depending on the technology used to build them. The rapid advances in current technologies,
however, made this impractical. Still, a survey of computer history by generations will be very
educational.
Zeroth Generation
            Authors generally disagree on which point in time a narrative of the history of computers should
begin. As early as 3000 BC, the Chinese are said to have used an early form of the bead-and-wire abacus.
In 876 BC, the use of the symbol for zero was first recorded in India (PCWS, 2000: 39). It is said that the
basic idea of computing developed in the 1200s when a Moslem cleric proposed solving problems with a
series of written procedures (Golden Ink, 2000).
          Edmund Gunter of England invented the slide rule, precursor of the calculator, in 1620. Blaise
Pascal invented the first mechanical calculator in 1642 using gears with teeth to store digits. When a gear
rotated past the tooth representing the digit 9, the next gear to the left shifted one tooth, or digit (Mandell,
1979: 22). The calculator could perform the work of six accountants, but the initial public reaction was not
overwhelming (PCWS, 2000: 39).
          The German mathematician Gottfried Wilhelm von Leibniz unveiled his invention, a four-function calculator that could also calculate square roots, in 1694. He built upon the ideas of Pascal and used the binary representation of numbers. “It is unworthy of excellent men to lose hours like slaves in the labor of calculation which could safely be relegated to anyone else if machines were used,” said Leibniz.
          In 1801 a Frenchman, Joseph-Marie Jacquard, built a loom that weaves by reading holes punched on small sheets of hardwood. These plates were inserted into the loom, which read the pattern and created the weave. Powered by water, this machine came 140 years before the development of the modern computer (Golden Ink, 2000), which would use the Jacquard concept that data could be stored in punched cards.
         In 1822, Charles Babbage designed the difference engine, a machine that could execute complex computations and print results without human intervention. It was originally intended to automatically calculate mathematical tables, which at that time were being made manually, thereby resulting in many errors and inaccuracies. The difference engine used the method of finite differences and iterations to compute complicated functions. It could compute mathematical tables with results of up to five significant digits. However, Babbage could not complete his project when he tried to build a larger model. The components he needed to meet the tolerances for accuracy could not be produced (Mandell, 1979: 22). The difference engine, however, was sufficiently developed by 1842 that Ada Lovelace used it to mechanically translate a short written work. She is generally regarded as the first programmer (Golden Ink, 2000).
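         To make the method of finite differences concrete, here is a minimal illustrative sketch in Python (this example is not from the original text and is not a model of Babbage's actual hardware): it tabulates a polynomial using only additions after an initial setup, which is the operation a difference engine mechanized.

    # Minimal sketch of the method of finite differences (illustrative only).
    # For a polynomial of degree n, the n-th differences are constant, so every
    # new table entry can be produced with additions alone.
    def tabulate(poly, start, count):
        """Tabulate poly (coefficients, lowest degree first) at start, start+1, ..."""
        def f(x):
            return sum(c * x ** k for k, c in enumerate(poly))

        n = len(poly) - 1                       # degree of the polynomial
        row = [f(start + i) for i in range(n + 1)]

        # Build the initial column of differences [f, delta f, delta^2 f, ...] at start.
        diffs = []
        while row:
            diffs.append(row[0])
            row = [b - a for a, b in zip(row, row[1:])]

        table = []
        for _ in range(count):
            table.append(diffs[0])
            for level in range(n):              # each step is pure addition
                diffs[level] += diffs[level + 1]
        return table

    # Example: tabulate x^2 + x + 41 for x = 0..9 using only additions.
    print(tabulate([41, 1, 1], 0, 10))          # 41, 43, 47, 53, 61, ...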
           In 1832, Babbage’s brilliant mind came up with another project, the first computer driven by external instructions (PCWS, 2000: 40), called the analytical engine. It could do the four fundamental mathematical operations, i.e., addition, subtraction, multiplication, and division, and store intermediate results in a memory unit. It was to have an arithmetic unit to perform computations, a punched-card input system, and an external storage unit (Mandell, 1979: 24). Due to lack of funding, however, and possibly because Babbage was way ahead of his time, he never built the steam-driven analytical engine.
 In 1854, Cork University mathematics professor George Boole published his thoughts on symbolic logic in his An Investigation of the Laws of Thought. Decades later, this would form the basis of computer science applications, and hence Boole is generally recognized today as the father of computer science. Sir Charles Wheatstone in 1857 introduced the continuously feeding paper tape that could be used for storing and reading data.
             The US Bureau of the Census commissioned Herman Hollerith of the Massachusetts Institute of Technology to build a tabulating machine in the 1880s. Hollerith showed that by putting census data on punched cards and having a machine do the sorting, 250 cards could be sorted in one minute, greatly reducing census tabulation time from ten years to two-and-a-half. In 1896, perhaps emboldened by the success of his machine in the 1890 census, Hollerith founded the Tabulating Machine Company to commercially manufacture and market punched-card equipment. Fifteen years later, the Tabulating Machine Company would merge with others to form the Computing Tabulating Recording Company, renamed International Business Machines, or IBM, in 1924.
             Dorr E. Felt introduced depressible keys in 1885 with his four-function calculator, the Comptometer.
            In 1897, Karl Braun developed the cathode ray tube (PCWS, 2000: 41), which would become a major factor in the manufacture of computer monitors or video displays.
            Konrad Zuse, a German construction engineer, built a mechanical calculator in 1935. Shortly after its completion, Zuse started work on a programmable device called the Z-1, the first to use binary codes, which he completed in 1938.
          At this point, claims and counterclaims about who first created what would blur any narrative. While in one account the Americans would claim to be the first to develop this type of computer, another narrative would show that the Germans had it first. The British also had their share of the story. The following narrative, thus, takes facts as told by various sources, without purposely asserting any claims. Besides, there was no global patenting scheme during those times yet, making it difficult to ascertain which really came first.
           As a concrete example, John Atanasoff of Iowa State University, aided by graduate student Clifford Berry, began pioneering work in 1937 on the first digital electronic computer, which years later would become the basis of the world’s first general-purpose electronic computer. The Atanasoff-Berry Computer or ‘ABC’ used vacuum tubes instead of electrical relays to carry out computations, solving linear equations common in physics. Atanasoff did not take out a patent for his work, expecting the University lawyers to do it. However, the administration did not take interest in his work, which was cannibalized by undergraduate students. His work would remain archived until some 10 years later. In the same year, George Stibitz developed the first binary circuit at the AT&T Bell Laboratories.
             In yet another account, a Russian immigrant named Vladimir Zworykin invented the CRT in 1928 (HCS, 1996).
           In 1941, Zuse developed the Z-3, which was based on relays and was very sophisticated for its
time. It utilized the binary number system and could handle floating-point arithmetic. Zuse had
considered employing vacuum tubes but chose relays instead because they were more readily available. In
1943, he started work on a general-purpose relay computer called the Z-4, which would survive the
bombs of World War II hidden in a cave in the Bavarian Alps. It would be up and running in a Swiss bank by 1950 (Maxfield, 1998).
         In 1944, Harvard University professor Howard Aiken completed the Harvard Mark I in
conjunction with IBM engineers. Harvard Mark I, officially known as the Automatic Sequence Controlled
Calculator or ASCC, was based on standard IBM electromechanical parts. It used electromagnetic relays,
switches, mechanical counters, shafts, and clutches. Described as sounding like a roomful of ladies
knitting, the machine contained more than 750 000 components, was 50 feet long, 8 feet tall, and weighed
approximately 5 tons. It could add or subtract 23-digit numbers in three-tenths of a second, multiply them
in four seconds, and divide them in ten seconds (Maxfield, 1998). It provided vital calculations to the
Navy in World War II, and until 1953 would be used to calculate artillery and bombing trajectory tables.
First Generation
              Characterized by mechanical and electromechanical technologies, zeroth generation computers were slow and unreliable. The inertia of moving parts limited computing speed, while gears and levers made information processing cumbersome. The birth of the electronic age led to the abandonment of gears, levers, counter wheels, and the like in favor of the new vacuum tube technology.
              The first generation computers used the vacuum tube as their primary component. These computers were quite large, requiring considerable space. Moreover, the vacuum tubes were expensive. They frequently broke down, making costs higher. These tubes also produced enormous amounts of heat, so air conditioning was generally required.
              First generation computers had relatively small main memories, usually on the order of a few thousand characters. Processing speed was at the millisecond level, while the IO media used were punched cards and paper tapes.
            John Mauchly and John Presper Eckert of the University of Pennsylvania Moore School of Engineering developed the Electronic Numerical Integrator And Calculator or ENIAC in 1946. It had thirty separate units, plus power supply and forced air-cooling. It weighed over thirty tons, and utilized 19 000 vacuum tubes, 1 500 relays, 70 000 resistors, 10 000 capacitors, and 500 000 solder joints, including thousands of inductors, consuming almost 200 kilowatts of electrical power. The ENIAC was originally part of the war effort, designed to compute ballistic trajectories very quickly, boasting 5 000 additions or subtractions per second. However, it came too late for the war and was used instead to solve equations to prove the theory behind the atom bomb, which it did in 2 hours as contrasted to the estimated 100 man-years of conventional computation required to do the same task. A 10-digit multiplication would take the Harvard Mark I around 3 seconds to perform, while ENIAC could do it in 3 milliseconds (Del Rosario, 1996: 2).
            In 1948, the Small-Scale Experimental Machine or SSEM, also known as the Manchester Mark I,
made its first successful run of a program. Created by Frederic Callan Williams and Tom Kilburn at the
University of Manchester, it was the first machine that had all the components now classically regarded as
characteristic of the basic computer. It had a RAM of just 32 locations or words, each word consisting of
32 bits. Thus the machine had a grand total of 1024 bits of memory.
 The EDSAC or Electronic Delay Storage Automatic Computer was completed in 1949 at Cambridge University by a team led by Maurice Wilkes. It was the first operational stored-program computer. Instead of wired control panels, instructions stored in the computer itself controlled the operation of the machine (Mandell, 1979: 27). The EDSAC introduced the concept of memory hierarchies, having a primary memory of 1024 words and a drum of 4600 words. The EDVAC or Electronic Discrete Variable Automatic Computer was completed in 1951 by John von Neumann, credited with the introduction of the stored program concept, wherein data and instructions reside in the same storage. Actually, EDVAC was started ahead of the EDSAC, but the former was crippled by the departure of some of the people working on the project to form their own enterprise.
 The first commercial electronic computer became available in 1951, developed by Eckert and
Mauchly. It was known as the UNIVAC I or the Universal Automatic Computer. The UNIVAC brought
more attention and interest to the computer when it correctly predicted the results of the US presidential
election in favor of Dwight D. Eisenhower. Opinion polls predicted a landslide victory for Adlai
Stevenson.
 In 1952, the MANIAC was built at Los Alamos Scientific Laboratory, along with the ILLIAC at the University of Illinois, the JOHNIAC at Rand Corporation, the SILLIAC in Australia, and others. John von Neumann and his colleagues completed the IAS bit-parallel machine in 1952 for the Institute for Advanced Study in Princeton, New Jersey. It became the prototype for the modern computer, incorporating the five functional units of input, output, memory, ALU, and control. It introduced random access main memory, and implemented the fetch-decode-execute instruction cycle. The IAS machine architecture would later become an industry standard and be named the von Neumann architecture.
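 As a rough illustration of the fetch-decode-execute cycle on a machine where instructions and data share one memory, here is a small hypothetical sketch in Python; the tiny LOAD/ADD/STORE/HALT instruction set and the memory layout are invented for the example and are not the IAS machine's real design.

    # Hypothetical sketch of a von Neumann style machine: one memory holds both
    # the program and the data, a program counter steps through it, and each
    # loop iteration is one fetch-decode-execute cycle.
    LOAD, ADD, STORE, HALT = 1, 2, 3, 0

    memory = [
        (LOAD, 10),    # 0: acc <- mem[10]
        (ADD, 11),     # 1: acc <- acc + mem[11]
        (STORE, 12),   # 2: mem[12] <- acc
        (HALT, 0),     # 3: stop
        0, 0, 0, 0, 0, 0,
        7, 35, 0,      # locations 10 and 11 hold operands; 12 receives the result
    ]

    pc, acc = 0, 0                     # program counter and accumulator
    while True:
        opcode, operand = memory[pc]   # fetch
        pc += 1
        if opcode == LOAD:             # decode and execute
            acc = memory[operand]
        elif opcode == ADD:
            acc += memory[operand]
        elif opcode == STORE:
            memory[operand] = acc
        else:                          # HALT
            break

    print(memory[12])                  # prints 42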
Second Generation
 The shift from the use of vacuum tubes to discrete transistors and diodes marked the second
generation, generally regarded as the era when the  computer industry was born. The invention of the
transistor, according to the American Institute of Physics, came on 17 November 1947, when Bell Laboratories scientist Walter Brattain dumped his whole experiment into a thermos of water. Brattain
wanted to get rid of the condensation forming on the silicon contraption he built, and although putting the
silicon in a vacuum would have been desired, he knew it would take too much time. He dumped it instead
in water, and was surprised to observe the largest amplification he had so far seen. The rest of the story is
history.
                The transistor is smaller than the vacuum tube, costs less, and is more reliable. Power consumption is lower, and thus heat dissipation is likewise less. During this period, the magnetic tape emerged as the major IO medium, main memory capacities were less than a hundred kilobytes, and computing speed reached the microsecond level.
                In January 1954, engineers from the Bell Labs built the first computer without vacuum tubes. Called the TRADIC or TRAnsistorized DIgital Computer, the machine occupied a mere three cubic feet, dwarfed by the 1 000-square-foot ENIAC. TRADIC had around 800 point-contact transistors and 10 000 germanium crystal rectifiers. It could do a million logical operations every second, and operated on less than 100 watts of power. TRADIC was 20 times faster than vacuum tube computers.
 A lot of innovations in the use of computers marked the second generation. High-level languages were developed, such as FORTRAN, the FORmula TRANslator (1954-57), ALGOL (1958), COBOL (1959), and PL/I (1963). Moreover, operating systems were also introduced during this period, but were initially limited to providing services required for batch processing. The use of separate IO processors and the development of compilers led to improvements such as having the OS handle all IO, the introduction of utilities, and elaborate loaders.
              Third Generation
             Fairchild Semiconductor filed a patent application for the planar process of manufacturing
transistors in 1959. Fairchild’s Robert Noyce constructed an integrated circuit with components
connected by aluminum lines on a silicon-oxide surface layer. Two years later, Fairchild released its first
commercial integrated circuit.
             In 1958, Jack Kilby of Texas Instruments, working independently, developed the idea of creating
a monolithic device on a single piece of silicon. His first integrated circuit contained five components on
a piece of germanium half an inch long and thinner than a toothpick (Polsson, 2000).
 The introduction of the integrated circuit, which is composed of numerous transistors manufactured
as a single unit, ushered in the third generation. The third generation computer is characterized by SSI and
MSI technologies. SSI or small-scale integration allows one to ten logic gates packed in a single IC, or
around 50 to 100 transistors in one package, while MSI or medium-scale integration allows up to a
hundred gates in one IC, or around a thousand transistors in a single package. The IC offered increased
reliability, faster processing capability, and advantages of smaller size and less power dissipation.
 Tom Watson Jr.’s 5,000,000,000-dollar gamble, as dubbed by Fortune magazine, was the most influential third generation computer. The IBM System 360, a family of computers with six upward compatible models, was introduced in 1964. Customers could choose from 5 processors and 19 combinations of power, speed, and memory. It boasted up to one million words of memory and a new 32-bit CPU with 16 general registers, 4 double-length registers for floating point operations, and an extensive instruction set. By the end of the decade, S/360 computers were in use all over the world.
 The S/360 was a success, and the industry reaction was varied. Some tried to clone the S/360 and sell it at a lower price, while others capitalized on the fact that the S/360 was not compatible with previous IBM machines. Products were developed to attract IBM customers who wanted to avoid the difficulties of conversion and porting to the new S/360. Still others tried to develop entirely different lines of machines which were more powerful and cost-effective than the S/360. However, not all vendors survived.
 Software developments in this generation included the support of multiprogramming, time-sharing, and virtual memory.
              Fourth Generation
               In around 1970, advances in semiconductor technology saw an increase in gate densities of up to several thousands per IC. Termed LSI, or large-scale integration, this made possible the fabrication of a CPU, or even an entire computer, on a single IC. Today’s ¼-square-inch Pentium III microprocessor, for example, is more powerful than the room-sized ENIAC.
 The fourth generation is characterized by LSI and VLSI IC technologies. VLSI stands for very large-scale integration, which allows the packing of millions of transistors in a single IC. This period is marked by increased complexity and miniaturization of the IC, rapid increases in processing capacity and speed alongside an inversely proportional cost of information processing, easier-to-use software, an astounding increase in software and hardware sales volume, and an insatiable demand for qualified computing personnel.
                In 1971, Intel Corporation unveiled the first commercially available microprocessor, the 1/6” x 1/8” 4004. The 4-bit processor contained 2 200 transistors, had 45 instructions, and was almost as powerful as the ENIAC. In the same year, Intel released the 8-bit 8008. Its variant, the 8080A, was released in 1973 and became an industry standard. Other vendors came up with their own 8-bit microprocessors, such as the Zilog Z80, developed by former Intel engineer Masatoshi Shima, the Motorola 6800, and Intel’s own 8085.
                  The microcomputer revolution began with the introduction of the personal computer in 1975. MITS, or Micro Instrumentation and Telemetry Systems, a small struggling company in Albuquerque, New Mexico, developed the Altair 8800 computer, the world’s first microcomputer based on the Intel 8080 processor. Named after one of the stars in the popular TV series Star Trek, the Altair 8800 was sold by mail order for US$397. MITS proprietor Ed Roberts was able to negotiate with Intel to sell him the 8080 processor for US$75 instead of the going price of US$360. Overnight, the MITS financial position turned around. It must be noted that the Altair 8800 was sold in kit form, unassembled. It had only 256 characters of memory, no keyboard, and no monitor, and programs were written in machine language (Delaney, 1995).
                The first 16-bit microprocessor was developed by National Semiconductor in 1975, called the PACE. In 1985, the same company introduced the first full 32-bit microprocessor, the NS32032, which had the speed and power of a minicomputer. Likewise, Intel packed over 275 000 transistors in its 32-bit 80386 microprocessor, which could execute three to four million instructions per second. Four years later, Intel unveiled the 80486, which contained 1 200 000 transistors and could execute 20 million instructions per second. Digital Equipment Corporation introduced the first 64-bit processor in 1993, the DEC Alpha. The 64-bit Intel Pentium processor, released in the same year, had 3 100 000 transistors and was five times faster than its predecessor, the 80486. It could execute 100 million instructions per second.
In 1995, the state-of-the-art VLSI technology was the 0.35 micron feature size, which meant that around 4 million transistors could be packed on a single IC. In 1998, it was 0.25 micron, or 7 million transistors in a single IC.
           

Capabilities, Characteristics, and Limitations of Computers

   
  The computer has its own characteristics, capabilities, and limitations. Characteristics are very important to an individual just like us, but did you know that the computer has characteristics of its own? The characteristics of the computer are speed, accuracy, efficiency, versatility, and storage. The computer also has capabilities, and examples of these are accuracy, the ability to store and retrieve data, speed, and arithmetic functions.
       If the computer has characteristics and capabilities, it also has limitations. Some of these limitations are: it cannot function without electric power, it cannot operate by itself, it depends on instructions, and it is unable to give meaning to objects. The computer is just like a person. Why......? Because some characteristics that people have, like speed, the computer has as well.

"COMPONENTS OF COMPUTER"

 
     A computer has its own components.
The hardware is the physical components of the computer that handle data and processing. The software is the set of instructions developed by a computer programmer; software is also known as a program. Software has two categories: the first is the operating system software, a set of programs that controls how the computer hardware and the software will work together, and the second is the disk operating system (DOS), which was intended to run on Intel-based machines.
      Peopleware also has two types: the end user and the computer programmer.
Because of these components, computers are improved more and more.....!

"Types of Computer"

     Computers come in different types, shapes, and sizes. They have become smaller and easier to handle as the years pass by. One example is the mainframe computer: these are large and very powerful computers because they have many processors working at the same time, performing several tasks. Minicomputers are a bit smaller than mainframe computers but bigger and more powerful than microcomputers.
      Microcomputers are small in size and affordable in price compared to mainframes and minicomputers.
Desktops are microcomputers that are small enough to be placed on a flat surface. The laptop is the most popular microcomputer because of its small size, yet it is as powerful as a desktop. The others are personal digital assistants...

Parts of the computer and their functions

             We also know that the computer is a system because it is made up of parts, where each part has an assigned function and works together with the other parts as one. One basic example of these parts is the monitor, which displays information such as text, numbers, and graphics. The keyboard is used to enter data into the computer by typing or pressing its keys. The mouse is used to enter data into the computer by clicking its buttons. The printer is used to produce print-outs of the data done in the computer.
            And the last is the speaker; the speaker is used to produce the sounds or music from the computer. These parts have a major role, and without them the computer would not function. But aside from these parts, there are still other devices that can be attached to the computer to enhance its capabilities.....!

"COMPUTERS"

Nowadays there are many technologies being developed, and one example of these is the computer. What is a computer..? That's the question that comes to our mind. As I have learned in CSBC01, a computer is a programmable machine that accepts input, processes it, and produces output. The computer is very useful. Why..? Because the computer helps us in many ways, such as in the fields of education, medicine, communications, and business. In the field of education, the computer helps us do research, and it also helps our instructors with computer-aided instruction and discussions to bring better learning to the students. In the field of medicine, the computer plays a very important role because nowadays many doctors in hospitals use computers to help them diagnose the illnesses of their patients. In communications, computers help us by sending and receiving emails, and one example of this is posting a blog.......!
             The person who makes a program is called a computer programmer.....!

Monday, October 24, 2011

Knowledge I Gathered in CSCBC01

HELLO......
 It's October 6, we meet again.....!
          That day our instructor discussed Microsoft Excel. Microsoft Excel can perform calculations, analyze information, and visualize data in a spreadsheet. We did an activity using it, and our instructor taught us how to make a formula. Here I learned how to make a formula and the parts of Microsoft Excel, for example the "name box" and the "formula bar", where you can see what has been typed in a cell. The cell is the box in the working area where you write the entries for your activities.
     Microsoft Excel can help us because we will be able to manage our daily activities through it. This lesson is very difficult to learn, but with the help of our instructor it becomes easy for us to learn. As our instructor said, "A grade is only a number; what is important is the lesson you have learned"...! That's all, thank you, and God bless us every one.....!
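     Since making formulas was the lesson, here is a small sketch showing the same idea from Python using the openpyxl library (the file name and cell values are made up for the example); when the saved file is opened in Excel and cell A3 is selected, the formula appears in the formula bar, just as discussed in class.

    # Small illustrative sketch: writing a simple SUM formula into a worksheet
    # with the openpyxl library (file name and values are invented).
    from openpyxl import Workbook

    wb = Workbook()
    ws = wb.active
    ws["A1"] = 10               # first number
    ws["A2"] = 25               # second number
    ws["A3"] = "=SUM(A1:A2)"    # Excel evaluates this formula when the file is opened
    wb.save("activity.xlsx")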

My learning in CSBC01

Hi....... everyone, it's another one again.......!
 September 29 was the deadline of our activity in CSBC01. But we were not yet done with the activity, so we tried to work faster, but we still did not have enough time to finish it. So our instructor, Mr. Beria, gave us time to finish it; he told us that after his discussion he would give us time to complete our activity. He discussed how to make a PowerPoint presentation and also gave the steps. He gave examples of PowerPoint presentations, showed how to design a PowerPoint, how to insert a picture into a PowerPoint, and how to post the PowerPoint to a blog. After his discussion he told us that we were going to make our own PowerPoint presentations and post them to his blog. After that he gave us time to finish our activity, so we proceeded to work on it until it was done. After we were done with our activity, Sir Beria checked it; after checking, he dismissed the class, and that's all for that day.....!
After the class we went to the carinderia (a small eatery) for our lunch.......!