Software Industry
The concept of software was developed well over 100 years earlier, in 19th-century England. Charles Babbage[58] came up with the Analytical Engine, which is often claimed to be the world's first computer. His earlier Difference Engine was more complex than anything of its time, a machine designed to tabulate polynomial functions automatically. Alan Turing[59] conceived the Turing machine in 1935 while working on the Entscheidungsproblem, or decision problem, which involved the relationship between mathematical symbols and the quantities they represent. The first electronic computer was the ENIAC, designed by John W. Mauchly[60]; among its first tasks was the calculation of ballistic tables, which were needed in enormous quantities to help the artillery fire their weapons at the correct angles. But what is software?[61]
Computer software is the part of a computer system that consists of encoded information or computer instructions, in contrast to the physical hardware from which the system is built. In computer science and software engineering, computer software is all information processed by computer systems: programs and data. Computer hardware and software require each other, and neither can realistically be used on its own. Software is not just about programming and programming languages, but about producing and selling the products made by programming (languages) as well. At the beginning of the so-called "Information Age," computers were programmed by feeding them direct instructions. This was done by setting switches or by making connections to different logical units with wires (circuitry). The first programming was done by typing in 1s and 0s, which were stored on different information carriers. In the early 1950s, programmers started to let the machines do a part of the job. This was called automatic coding, and it made life a lot easier for the early programmers. The first company to provide software products and services was Computer Usage Company, in 1955.
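To make "automatic coding" concrete, here is a toy sketch of the kind of translation the earliest assemblers automated: turning human-readable mnemonics into the 1s and 0s that were previously entered by hand. The four-instruction machine below is invented purely for illustration; it matches no real 1950s computer.

```python
# Invented instruction set for illustration only (not a real machine).
OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011, "HALT": 0b1111}

def assemble(source: str) -> list:
    """Turn 'MNEMONIC operand' lines into 8-bit words: 4-bit opcode + 4-bit operand."""
    words = []
    for line in source.strip().splitlines():
        parts = line.split()
        opcode = OPCODES[parts[0]]
        operand = int(parts[1]) if len(parts) > 1 else 0
        words.append((opcode << 4) | (operand & 0xF))
    return words

program = """
LOAD 2
ADD 3
STORE 7
HALT
"""

for word in assemble(program):
    print(f"{word:08b}")   # the binary that early programmers once typed in by hand
```

The point of the sketch is the division of labor: the human writes the mnemonics, and the machine does the tedious, error-prone translation into binary.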
The word "software" did not appear in print until the 1960s. The industry did not expand until the early 1960s, almost immediately after computers were first sold in mass-produced quantities. Software companies founded in the early 1960s included Advanced Computer Techniques, Automatic Data Processing, Applied Data Research, and Informatics General. The computer and hardware makers started bundling operating systems, systems software, and programming environments with their machines. When Digital Equipment Corporation (DEC) brought a relatively low-priced minicomputer to market, it brought computing within the reach of many more companies and universities worldwide, and it spawned great innovation in terms of new, powerful programming languages and methodologies. New software was built for these machines, and other manufacturers, including IBM, quickly followed DEC's example, resulting in the IBM AS/400, among others. The industry expanded greatly with the rise of the personal computer (PC) in the mid-1970s, which brought desktop computing to the office worker for the first time. In the following years, the PC also created a growing market for games, applications, and utilities. DOS, Microsoft's first operating system product, became the dominant operating system of the era. AI[62] is the intelligence exhibited by machines. An intelligent machine is a flexible rational agent that perceives its environment and takes actions that maximize its chance of success at an arbitrary goal. But to make this a reality, new programming languages were needed. Strangely enough, these languages, which could mimic intelligence (Lisp and Prolog being early examples), were developed in parallel with the other languages.
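As a rough illustration of the "rational agent" idea above, here is a toy sketch of an agent that cannot see the environment's hidden state, so it estimates each action's chance of success from experience and (mostly) picks the action with the best estimate. The environment, reward values, and exploration rate below are invented for illustration; this is not any specific AI system.

```python
import random

class GuessingEnvironment:
    """Toy environment: the agent must discover a hidden digit; a correct guess scores 1.0."""
    def __init__(self):
        self.target = random.randint(0, 9)

    def act(self, action: int) -> float:
        return 1.0 if action == self.target else 0.0

class RationalAgent:
    """Keeps a running estimate of each action's value; acts greedily on it."""
    def __init__(self, actions):
        self.values = {a: 0.0 for a in actions}
        self.counts = {a: 0 for a in actions}

    def choose(self, epsilon: float = 0.3) -> int:
        if random.random() < epsilon:                  # explore occasionally
            return random.choice(list(self.values))
        return max(self.values, key=self.values.get)   # exploit the best estimate

    def learn(self, action: int, reward: float) -> None:
        self.counts[action] += 1
        step = 1 / self.counts[action]
        self.values[action] += step * (reward - self.values[action])  # running average

env = GuessingEnvironment()
agent = RationalAgent(actions=range(10))
for _ in range(200):
    a = agent.choose()
    agent.learn(a, env.act(a))

print("agent's best action:", agent.choose(epsilon=0.0), "hidden target:", env.target)
```

The loop of perceiving a reward, updating an estimate, and choosing the action that maximizes expected success is exactly the perceive-and-act cycle the definition describes.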
Programming became an 80-hour-a-week job, and in the late 1980s graphical user interfaces were created by the same manufacturers that made languages and tools such as C, Delphi, and Clipper VO to expedite the creation of software. Though this kind of interface stemmed from as early as the 1960s, the idea did not take off until the early 1990s. The drag-and-drop interface was introduced by the Macintosh and changed the world of PCs
forever. During the ARPANET project, networks were developed in the late 1960s and early 1970s using a variety of communications protocols. This work led to the development of protocols for internetworking, by which multiple separate networks could be joined into a single network of networks. In 1982, the Internet protocol suite (TCP/IP) was introduced as the standard networking protocol on the ARPANET. In the 1980s, the work of British computer scientist Tim Berners-Lee on the World Wide Web turned theorized protocols for linking hypertext documents into a working system, marking the beginning of the modern Internet. Since the mid-1990s, the Internet has had a revolutionary impact on culture and commerce, including the rise of near-instant communication by electronic mail, instant messaging, voice over Internet Protocol (VoIP) telephone calls, two-way interactive video calls, and the World Wide Web with its discussion forums, blogs, social networking, and online shopping sites. The Internet's takeover of the global communication landscape was almost instant in historical terms: it carried only 1% of the information flowing through two-way telecommunications networks in 1993, already 51% by 2000, and more than 97% of the telecommunicated information by 2007. Growth continues today, driven by ever greater amounts of online information, commerce, entertainment, and social networking.
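The TCP/IP suite standardized in 1982 is still what carries all of this traffic today. As a small, self-contained illustration, here is a TCP echo exchange over the local loopback interface using Python's standard socket API; the port number is an arbitrary choice for the example.

```python
import socket
import threading
import time

def serve_once(port: int) -> None:
    """Accept one TCP connection and echo back whatever it sends."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            conn.sendall(conn.recv(1024))

PORT = 9000
threading.Thread(target=serve_once, args=(PORT,), daemon=True).start()
time.sleep(0.2)                          # crude wait until the server is listening

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(("127.0.0.1", PORT))     # TCP handshake, carried over IP
    cli.sendall(b"hello, internet")
    print(cli.recv(1024).decode())       # prints: hello, internet
```

Every email, VoIP call, and web page mentioned above ultimately rides on connections shaped like this one.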
During the 1990s, much attention was paid in the IT industry to outsourcing: the phenomenon of contracting out all or part of the IT function. High costs at home made it cheaper to develop software in low-cost countries, and the practice grew to a whole new dimension known as "offshoring".
Data communications and voice telephony costs are now so low, and bandwidth so
broad, and the Internet so ubiquitous, that it is a simple matter to run an
applications development center offshore. In the early years of the 21st century, another successful business model arose for hosted software: software-as-a-service, or SaaS[63], a software licensing and delivery model in which software is licensed on a subscription basis and is centrally hosted. It is sometimes referred to as "on-demand software". SaaS is typically
accessed by users using a thin client via a web browser. It is part of the
nomenclature of cloud computing, along with infrastructure as a service,
platform as a service, desktop as a service, managed software as a service,
mobile backend as a service, and information technology management as a service.
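In practice, "accessed via a thin client" means the customer's machine keeps no application logic at all; it just makes authenticated HTTP calls to the centrally hosted service. Here is a minimal sketch of that pattern, where the service URL, the /invoices endpoint, and the API key are all hypothetical placeholders rather than any real SaaS API.

```python
import json
import urllib.request

# Hypothetical placeholders: this host, endpoint, and key do not exist.
API_BASE = "https://api.example-saas.com/v1"
API_KEY = "sk-demo-123"   # the subscription credential, per the SaaS licensing model

def create_invoice(customer_id: str, amount_cents: int) -> dict:
    """Ask the hosted service to create an invoice; no billing logic runs locally."""
    payload = json.dumps({"customer": customer_id, "amount": amount_cents}).encode()
    req = urllib.request.Request(
        f"{API_BASE}/invoices",
        data=payload,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Would fail against the placeholder host; shown only to illustrate the call shape.
    print(create_invoice("cust_42", 1999))
```

Upgrades, data storage, and computation all happen on the provider's side, which is what makes the subscription ("pay while you use it") licensing model workable.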
The software industry is today going through major transformations; some call it a digital revolution. The rise of mobile applications, the emphasis on ease of use, the inspiration of the cloud, and the sheer amount of data flowing through organizations, together with the need to connect and analyze it, are causing a fundamental shift in the software industry. Technology used to be consumed in such a way that the consumer tolerated the risk; today the consumer wants to pay for value, and for technology only when actually using it. Today, technology is crucial to success in our everyday life. In the future, technology will be everything. The countries that have the strongest IT industries will prevail over those that are underdeveloped or do not understand the impact of technology on everyday life.