Friday 29 June 2012

Discrete Mathematics Applications and Importance in Computer Science

Discrete mathematics is the branch of mathematics dealing with objects that can assume only distinct, separated values. The term "Discrete Mathematics" is therefore used in contrast with "Continuous Mathematics," which is the branch of mathematics dealing with objects that can vary smoothly (and which includes, for example, calculus). Whereas discrete objects can often be characterized by integers, continuous objects require real numbers.

The study of how discrete objects combine with one another and the probabilities of various outcomes is known as combinatorics. Other fields of mathematics that are considered to be part of discrete mathematics include graph theory and the theory of computation. Topics in number theory such as congruences and recurrence relations are also considered part of discrete mathematics.

The study of topics in discrete mathematics usually includes the study of algorithms, their implementations, and efficiencies. Discrete mathematics is the mathematical language of computer science, and as such, its importance has increased dramatically in recent decades.

The set of objects studied in discrete mathematics can be finite or infinite. The term finite mathematics is sometimes applied to parts of the field of discrete mathematics that deal with finite sets, particularly those areas relevant to business.

Research in discrete mathematics increased in the latter half of the twentieth century partly due to the development of digital computers which operate in discrete steps and store data in discrete bits. Concepts and notations from discrete mathematics are useful in studying and describing objects and problems in branches of computer science, such as computer algorithms, programming languages, cryptography, automated theorem proving, and software development. Conversely, computer implementations are significant in applying ideas from discrete mathematics to real-world problems, such as in operations research.

IMPORTANCE OF DISCRETE MATHEMATICS IN COMPUTER SCIENCE

Achieving working knowledge of many principles of computer science requires mastery of certain relevant mathematical concepts and skills. For example, a grasp of Boolean algebra, including De Morgan's laws, is useful for understanding Boolean expressions and the basics of combinational circuits; concepts surrounding the growth of functions and summations are useful for the analysis of loop control structures; exposure to solving recurrence relations is de rigueur for the analysis of recursive algorithms; and an introduction to proof methods facilitates consideration of program correctness and rigorous thinking in general.
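
To make one of these connections concrete, here is a minimal sketch (plain Python, chosen only for brevity; the helper name is illustrative, not from any textbook) that verifies De Morgan's law by exhaustively checking its two-variable truth table:

    from itertools import product

    def demorgan_holds(p, q):
        # De Morgan's law: not (p and q)  <=>  (not p) or (not q)
        return (not (p and q)) == ((not p) or (not q))

    # Exhaustively check every truth assignment (a two-variable truth table).
    assert all(demorgan_holds(p, q) for p, q in product([False, True], repeat=2))
    print("De Morgan's law holds for all truth assignments")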

Students are introduced to proof techniques before they begin to consider the idea of proving programs correct. They learn about propositional logic and Boolean algebra before they study some very elementary circuits and learn decision control structures and Boolean variables. They are introduced to predicate logic around the time they begin programming and learn about variables. They learn about growth of functions, big-O notation, and summations before they analyze loops and nested loops, so they have the tools to begin algorithm analysis from the time they first learn about iterative constructs. In conjunction with an introduction to number theory, they do laboratory and programming exercises involving an assortment of integer algorithms.
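
One classic integer algorithm such exercises might involve is Euclid's algorithm for the greatest common divisor; a minimal iterative sketch (Python, purely illustrative) with exactly the kind of loop the concepts above help analyze is:

    def gcd(a, b):
        # Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b).
        # The loop terminates because the second argument strictly decreases.
        while b != 0:
            a, b = b, a % b
        return a

    print(gcd(252, 198))  # 18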

Students learn about recursive definitions, recurrence relations, analyzing recursive algorithms, and writing recursive algorithms and programs together in the same course. They study matrices and matrix manipulations in conjunction with the array data structure. They learn about permutations and combinations, relations, graphs, and trees at the same time that their programming knowledge and sophistication are improving, so they can do increasingly interesting programming exercises involving these concepts.
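
As one hedged illustration of how a recursive definition and its recurrence relation travel together, the sketch below (Python, with illustrative names) computes Fibonacci numbers from their recursive definition and also counts the calls made, a quantity that obeys the recurrence C(n) = C(n-1) + C(n-2) + 1 and therefore grows exponentially:

    def fib(n):
        # Recursive definition: F(0) = 0, F(1) = 1, F(n) = F(n-1) + F(n-2).
        if n < 2:
            return n
        return fib(n - 1) + fib(n - 2)

    def calls(n):
        # Number of calls made by fib(n); satisfies C(n) = C(n-1) + C(n-2) + 1.
        if n < 2:
            return 1
        return calls(n - 1) + calls(n - 2) + 1

    print(fib(10), calls(10))  # 55 and 177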

Discrete Mathematics

  • Propositional and predicate logic
  • Boolean algebra
  • Functions, growth of functions, big-O notation
  • Sequences and summations
  • Integers, elementary number theory
  • Proof techniques: direct, indirect, contradiction, induction
  • Matrices
  • Counting
  • Pigeonhole principle
  • Permutations and combinations
  • Discrete probability
  • Recursive definitions, recurrence relations
  • Relations: properties, applications, representation, closures, equivalence
  • Graphs: terminology, representation, isomorphism, connectivity, paths

Computer Science Topics

  • Computers and Computation
  • Computer System Organization
  • Abstraction, problem solving, algorithms
  • Logic gates and elementary circuits
  • Algorithm design, analysis of simple loops
  • Representation (ASCII, binary, two's complement, floating point, instructions)
  • Algorithm correctness, loop invariants, pre- and post-conditions
  • Machine cycle
  • More algorithm correctness
  • Analysis of nested loops
  • Analysis of recursive algorithms
  • Run-time organization, run-time stack
  • Searching and sorting
  • Trees, binary search trees, traversals

Programming Constructs and Concepts

  • Pascal program structure
  • Modularity and hierarchical design
  • Variables
  • Real, char, Boolean, integer, subrange, and enumerated types
  • Input/output
  • Procedures, functions, parameter passing, scope
  • Iterative control structures
  • Decision control structures
  • Text files
  • One-dimensional arrays
  • Two-dimensional arrays
  • Nested loops
  • Records
  • Pointers and linked lists
  • Recursion
  • Abstract data types: stacks and queues

APPLICATIONS OF DISCRETE MATHEMATICS

Theoretical Computer Science

Theoretical computer science includes areas of discrete mathematics relevant to computing. It draws heavily on graph theory and logic. Included within theoretical computer science is the study of algorithms for computing mathematical results. Computability studies what can be computed in principle, and has close ties to logic, while complexity studies the time taken by computations. Automata theory and formal language theory are closely related to computability. Petri nets and process algebras are used to model computer systems, and methods from discrete mathematics are used in analyzing VLSI electronic circuits. Computational geometry applies algorithms to geometrical problems, while computer image analysis applies them to representations of images. Theoretical computer science also includes the study of various continuous computational topics.

Information Theory

Information theory involves the quantification of information. Closely related is coding theory, which is used to design efficient and reliable data transmission and storage methods. Information theory also includes continuous topics such as analog signals, analog coding, and analog encryption.

[Image: information life cycle]

Mathematical logic

Logic is the study of the principles of valid reasoning and inference, as well as of consistency, soundness, and completeness. For example, in most systems of logic (but not in intuitionistic logic) Peirce's law (((P→Q)→P)→P) is a theorem. For classical logic, it can be easily verified with a truth table. The study of mathematical proof is particularly important in logic, and has applications to automated theorem proving and formal verification of software.
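
Since the text notes that Peirce's law can be verified with a truth table, a small sketch of that check (Python used purely for illustration; `implies` is a helper defined here, not a library function) is:

    from itertools import product

    def implies(a, b):
        # Material implication: a -> b is false only when a is true and b is false.
        return (not a) or b

    # Peirce's law: ((P -> Q) -> P) -> P holds for every classical truth assignment.
    assert all(implies(implies(implies(p, q), p), p)
               for p, q in product([False, True], repeat=2))
    print("Peirce's law is a classical tautology")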

Logical formulas are discrete structures, as are proofs, which form finite trees or, more generally, directed acyclic graph structures (with each inference step combining one or more premise branches to give a single conclusion). The truth values of logical formulas usually form a finite set, generally restricted to two values, true and false, but logic can also be continuous-valued, e.g., fuzzy logic. Concepts such as infinite proof trees or infinite derivation trees have also been studied, e.g., in infinitary logic.

Set theory

Set theory is the branch of mathematics that studies sets, which are collections of objects, such as {blue, white, red} or the (infinite) set of all prime numbers. Partially ordered sets and sets with other relations have applications in several areas.
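
A minimal illustration of finite-set operations (using Python's built-in set type, shown here only as an example) is:

    primes_below_20 = {2, 3, 5, 7, 11, 13, 17, 19}
    odds_below_20 = {1, 3, 5, 7, 9, 11, 13, 15, 17, 19}

    print(primes_below_20 & odds_below_20)  # intersection: the odd primes below 20
    print(primes_below_20 | odds_below_20)  # union
    print(primes_below_20 - odds_below_20)  # difference: {2}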

[Image: intersection of two sets]

In discrete mathematics, countable sets (including finite sets) are the main focus. The beginning of set theory as a branch of mathematics is usually marked by Georg Cantor's work distinguishing between different kinds of infinite set, motivated by the study of trigonometric series, and further development of the theory of infinite sets is outside the scope of discrete mathematics. Indeed, contemporary work in descriptive set theory makes extensive use of traditional continuous mathematics.

Combinatorics

Combinatorics studies the ways in which discrete structures can be combined or arranged. Enumerative combinatorics concentrates on counting the number of certain combinatorial objects; e.g., the twelvefold way provides a unified framework for counting permutations, combinations, and partitions. Analytic combinatorics concerns the enumeration (i.e., determining the number) of combinatorial structures using tools from complex analysis and probability theory. In contrast with enumerative combinatorics, which uses explicit combinatorial formulae and generating functions to describe the results, analytic combinatorics aims at obtaining asymptotic formulae. Design theory is the study of combinatorial designs, which are collections of subsets with certain intersection properties. Partition theory studies various enumeration and asymptotic problems related to integer partitions, and is closely related to q-series, special functions, and orthogonal polynomials. Originally a part of number theory and analysis, partition theory is now considered a part of combinatorics or an independent field. Order theory is the study of partially ordered sets, both finite and infinite.
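
As a small illustrative check (not part of the original text), explicit enumeration of permutations and combinations can be compared against the closed-form counting formulae:

    from itertools import combinations, permutations
    from math import comb, factorial

    n, k = 5, 3
    # Explicit enumeration versus the closed-form counting formulae.
    assert len(list(permutations(range(n), k))) == factorial(n) // factorial(n - k)  # 60
    assert len(list(combinations(range(n), k))) == comb(n, k)                        # 10
    print(factorial(n) // factorial(n - k), comb(n, k))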

Graph theory

Graph theory, the study of graphs and networks, is often considered part of combinatorics, but has grown large enough and distinct enough, with its own kind of problems, to be regarded as a subject in its own right. Graphs are one of the prime objects of study in discrete mathematics. They are among the most ubiquitous models of both natural and human-made structures. They can model many types of relations and process dynamics in physical, biological, and social systems. In computer science, they can represent networks of communication, data organization, computational devices, the flow of computation, etc. In mathematics, they are useful in geometry and certain parts of topology, e.g., knot theory. Algebraic graph theory has close links with group theory. There are also continuous graphs; however, for the most part, research in graph theory falls within the domain of discrete mathematics.
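
A minimal sketch of a graph as a discrete structure — here an adjacency list with a breadth-first search for connectivity (illustrative Python; the vertex names and helper are made up) — is:

    from collections import deque

    # A small undirected graph as an adjacency list (dictionary of sets).
    graph = {"a": {"b", "c"}, "b": {"a", "d"}, "c": {"a"}, "d": {"b"}, "e": set()}

    def reachable(g, start):
        # Breadth-first search: the set of vertices connected to `start`.
        seen, queue = {start}, deque([start])
        while queue:
            v = queue.popleft()
            for w in g[v] - seen:
                seen.add(w)
                queue.append(w)
        return seen

    print(reachable(graph, "a"))                      # {'a', 'b', 'c', 'd'}
    print(len(reachable(graph, "a")) == len(graph))   # False: 'e' is isolated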

[Image: graphs and networks]

Discrete probability theory

Discrete probability theory deals with events that occur in countable sample spaces. For example, count observations such as the numbers of birds in flocks comprise only natural number values {0, 1, 2, ...}. On the other hand, continuous observations such as the weights of birds comprise real number values and would typically be modeled by a continuous probability distribution such as the normal distribution. Discrete probability distributions can be used to approximate continuous ones and vice versa. For highly constrained situations such as throwing dice or experiments with decks of cards, calculating the probability of events is basically enumerative.
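
For such enumerative cases, a short sketch (Python, illustrative only) that finds the probability of rolling a total of 7 with two fair dice by listing all 36 equally likely outcomes is:

    from itertools import product
    from fractions import Fraction

    # Enumerate the 36 equally likely outcomes of rolling two fair dice.
    outcomes = list(product(range(1, 7), repeat=2))
    favourable = [o for o in outcomes if sum(o) == 7]

    print(Fraction(len(favourable), len(outcomes)))  # 1/6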

[Image: dice]

Number theory

Number theory is concerned with the properties of numbers in general, particularly integers. It has applications to cryptography, cryptanalysis, and cryptology, particularly with regard to modular arithmetic, Diophantine equations, linear and quadratic congruences, prime numbers, and primality testing. Other discrete aspects of number theory include the geometry of numbers. In analytic number theory, techniques from continuous mathematics are also used. Topics that go beyond discrete objects include transcendental numbers, Diophantine approximation, p-adic analysis, and function fields.
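
A hedged sketch of the modular-arithmetic and primality-testing flavour of these applications (trial division is shown only for illustration; practical systems use faster probabilistic tests such as Miller-Rabin) is:

    def is_prime(n):
        # Trial division: adequate for small n, not for cryptographic sizes.
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True

    # Modular arithmetic, the workhorse of cryptographic applications:
    print(pow(7, 128, 13))                         # 7^128 mod 13, computed efficiently
    print([p for p in range(30) if is_prime(p)])   # primes below 30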

[Image: number theory]

Algebra

Algebraic structures occur as both discrete examples and continuous examples. Discrete algebras include: Boolean algebra, used in logic gates and programming; relational algebra, used in databases; discrete and finite versions of groups, rings, and fields, which are important in algebraic coding theory; and discrete semigroups and monoids, which appear in the theory of formal languages.

Discrete geometry and computational geometry

Discrete geometry and combinatorial geometry are about combinatorial properties of discrete collections of geometrical objects. A long-standing topic in discrete geometry is tiling of the plane. Computational geometry applies algorithms to geometrical problems.

Trees

Trees are used to represent data that has some hierarchical relationship among the data elements.
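
A minimal illustration (nested Python dictionaries standing in for a tree; the file-system names are made up) is:

    # A tiny tree of nested dictionaries: each key is a node, each value its children.
    filesystem = {"/": {"home": {"alice": {}, "bob": {}}, "etc": {}, "var": {"log": {}}}}

    def show(tree, depth=0):
        # Pre-order traversal: print each node indented under its parent.
        for name, children in tree.items():
            print("  " * depth + name)
            show(children, depth + 1)

    show(filesystem)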

[Image: a tree as a discrete structure]

Topology

Although topology is the field of mathematics that formalizes and generalizes the intuitive notion of "continuous deformation" of objects, it gives rise to many discrete topics; this can be attributed in part to the focus on topological invariants, which themselves usually take discrete values. See combinatorial topology, topological graph theory, topological combinatorics, computational topology, discrete topological space, finite topological space, topology (chemistry).

[Image: mesh network topology]

CONCLUSIONS

We emphasize the essential role that mathematics plays in the development of computer science, both for the particular knowledge and for the reasoning skills associated with mathematical maturity. We stress the importance of certain mathematical concepts for computer science, and we present comprehensive lists of mathematics topics and their computer science applications.

Friday 24 February 2012

Von Neumann Architecture

The term Von Neumann Architecture derives from a computer architecture proposal by the mathematician and early computer scientist John von Neumann and others, dated June 30, 1945, entitled First Draft of a Report on the EDVAC. This describes a design architecture for an electronic digital computer with subdivisions of a processing unit consisting of an arithmetic logic unit and processor registers, a control unit containing an instruction register and program counter, a memory to store both data and instructions, external mass storage, and input and output mechanisms. The meaning of the term has evolved to mean a stored-program computer in which an instruction fetch and a data operation cannot occur at the same time because they share a common bus. This is referred to as the Von Neumann bottleneck and often limits the performance of the system.

JOHN VON NEUMANN: THE FATHER OF THE MODERN COMPUTER

The heart of the Von Neumann Computer Architecture is the Central Processing Unit (CPU), consisting of the control unit and the ALU (Arithmetic and Logic Unit). The CPU interacts with a memory and an input/output (I/O) subsystem and executes a stream of instructions (the computer program) that process the data stored in memory and perform I/O operations. The key concept of the Von Neumann Architecture is that data and instructions are stored in the memory system in exactly the same way. Thus, the memory content is defined entirely by how it is interpreted. This is essential, for example, for a program compiler that translates a user-understandable programming language into the instruction stream understood by the machine. The output of the compiler is ordinary data; however, these data can then be executed by the CPU as instructions. A variety of instructions can be executed for moving and modifying data, and for controlling which instructions to execute next. The collection of instructions is called the instruction set, and, together with the resources needed for their execution, the instruction set architecture (ISA).

Instruction execution is driven by a periodic clock signal. Although several sub-steps have to be performed for the execution of each instruction, sophisticated CPU implementation technologies exist that can overlap these steps such that, ideally, one instruction can be executed per clock cycle. Clock rates of today's processors are in the gigahertz range, allowing billions of basic operations (such as adding two numbers or copying a data item to a storage location) to be performed per second.
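
To make the stored-program idea and the fetch-execute cycle concrete, here is a toy sketch (Python, with an invented three-instruction machine; nothing here corresponds to a real ISA) in which instructions and data share a single memory:

    # A toy stored-program machine: instructions and data live in the same memory,
    # and one loop fetches, decodes and executes an instruction per "cycle".
    memory = [
        ("LOAD", 6),     # 0: acc <- mem[6]
        ("ADD", 7),      # 1: acc <- acc + mem[7]
        ("STORE", 8),    # 2: mem[8] <- acc
        ("HALT", None),  # 3:
        None, None,      # 4-5: unused
        2, 3, 0,         # 6-8: data words
    ]

    pc, acc = 0, 0                       # program counter and accumulator
    while True:
        opcode, operand = memory[pc]     # fetch (and trivially decode)
        pc += 1
        if opcode == "LOAD":
            acc = memory[operand]
        elif opcode == "ADD":
            acc += memory[operand]
        elif opcode == "STORE":
            memory[operand] = acc
        elif opcode == "HALT":
            break

    print(memory[8])   # 5: the sum of the two data words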

With the continuing progress in technology, CPU speeds have increased rapidly. As a result, the limiting factors for the overall speed of a computer system are the much slower I/O operations and the memory system, since the speed of these components has improved at a slower rate than CPU technology. Caches are an important means for improving the average speed of memory systems by keeping the most frequently used data in a fast memory that is close to the processor. Another factor hampering CPU speed increases is the inherently sequential nature of Von Neumann instruction execution. Methods of executing several instructions simultaneously are being developed in the form of parallel processing architectures.

The design of the Von Neumann Architecture is simpler than the more modern Harvard Architecture, which is also a stored-program system but has one set of dedicated address and data buses for reading data from and writing data to memory, and another set of address and data buses for fetching instructions.

A stored-program digital computer is one that keeps its programmed instructions, as well as its data, in read-write, random-access memory (RAM). Stored-program computers were an advancement over the program-controlled computers of the 1940s, such as the Colossus and the ENIAC, which were programmed by setting switches and inserting patch leads to route data and control signals between various functional units. In the vast majority of modern computers, the same memory is used for both data and program instructions.


The von Neumann Computer Model

Von Neumann Computer Systems contain three main building blocks:

  1. Central processing unit (CPU)
  2. Memory
  3. Input/output devices (I/O)

These three components are connected together using the System Bus.

The most prominent items within the CPU are the registers. They can be manipulated directly by a computer program.


Components of the Von Neumann Model

  1. Memory: Storage of information (data/program)
  2. Processing Unit: Computation/Processing of Information
  3. Input: Means of getting information into the computer. e.g. keyboard, mouse
  4. Output: Means of getting information out of the computer. e.g. printer, monitor
  5. Control Unit: Makes sure that all the other parts perform their tasks correctly and at the correct time

Communication between Memory and Processing Unit

Communication between memory and the processing unit involves two registers:

  1. Memory Address Register (MAR)
  2. Memory Data Register (MDR)

To read

  1. The address of the location is put in MAR.
  2. The memory is enabled for a read.
  3. The value is put in MDR by the memory.

To write

  1. The address of the location is put in MAR.
  2. The data is put in MDR.
  3. The Write Enable signal is asserted.
  4. The value in MDR is written to the location specified.
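
A small sketch that mirrors these read and write steps (Python, with a made-up Memory class; purely illustrative, not a hardware model) is:

    class Memory:
        """A toy memory with MAR/MDR registers, mirroring the steps above."""
        def __init__(self, size):
            self.cells = [0] * size
            self.mar = 0    # Memory Address Register
            self.mdr = 0    # Memory Data Register

        def read(self, address):
            self.mar = address               # 1. address into MAR
            self.mdr = self.cells[self.mar]  # 2-3. memory puts the value into MDR
            return self.mdr

        def write(self, address, value):
            self.mar = address               # 1. address into MAR
            self.mdr = value                 # 2. data into MDR
            self.cells[self.mar] = self.mdr  # 3-4. write enable copies MDR to the cell

    mem = Memory(16)
    mem.write(5, 42)
    print(mem.read(5))   # 42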

Memory Operations

There are two key operations on memory:

  1. Fetch (address) returns value without changing the value stored at that address.
  2. Store (address, value) writes new value into the cell at the given address.

This type of memory is random-access, meaning that the CPU can access any value of the array at any time (vs. sequential access, as on a tape). Such memories are called RAM (random-access memory). Some memory is non-volatile and read-only (ROM, or read-only memory).


ALU, the Processing Unit

  • The processing unit is the hardware that implements arithmetic and logical operations.
  • ALU stands for Arithmetic and Logic Unit; it is capable of performing ADD, SUBTRACT, AND, OR, and NOT operations.
  • The size of the ALU's input quantities is often referred to as the word length of the computer.
  • Many processors today have a word length of 32 or 64 bits.
  • The processing unit also includes a set of registers for temporary storage of data and memory addressing.

Control Unit

  • Manages the processing unit
  • Implemented as a finite state machine (FSM)
  • The FSM directs all activity
  • Clock-based, step-by-step, cycle-by-cycle processing
  • The FSM is controlled by the:
  1. Clock signal
  2. Instruction Register
  3. Reset signal

Input/output

  • I/O controller provides the necessary interface to I/O devices
  • Takes care of low-level, device-dependent details
  • Provides necessary electrical signal interface

Types of Von Neumann Computers Today

Today, the Von Neumann Scheme is the basic architecture of most computers appearing in many forms, including supercomputers, workstations, personal computers, and laptops.

Disadvantages of Von Neumann architecture

  • Every piece of data and instructions has to pass across the data bus in order to move from main memory into CPU (and back again). This is a problem because the data bus is a lot slower than the rate at which the CPU can carry out instructions. This is called the “Von Neumann Bottleneck”.
  • Both the data and programs share the same memory space. This is a problem because it is quite easy for a poorly written or faulty piece of code to write data into an area holding other instructions, trashing that program.
  • The rate at which data needs to be fetched and the rate at which instructions need to be fetched are often very different. And yet they share the same bottlenecked data bus.

Conclusion

The Von Neumann Architecture has been incredibly successful, with most modern computers following the idea. The CPU chip of a personal computer holds the control unit and the arithmetic logic unit (along with some local memory), while the main memory takes the form of RAM sticks located on the motherboard.
But there are some basic problems with it, and because of these problems, other architectures have been developed.

Friday 27 January 2012

Brief Comparison between UNIX and LINUX

Introduction

The history of UNIX® dates back to 1969. Through the years, it has developed and evolved through a number of different versions and environments. Most modern UNIX variants known today are licensed versions of one of the original UNIX editions. Sun's Solaris, Hewlett-Packard's HP-UX, and IBM's AIX® are all flavors of UNIX that have their own unique elements and foundations. For example, Sun's Solaris is UNIX, but incorporates many tools and extensions designed to get the best out of Sun's own workstation and server hardware.

Linux® was born out of the desire to create a free software alternative to the commercial UNIX environments. Its history dates back to 1991, or further back to 1983, when the GNU project, whose original aim was to provide a free alternative to UNIX, was introduced. Linux runs on a much wider range of platforms than most UNIX environments, such as the Intel®/AMD-led x86 platform. Most UNIX variants run on just one architecture.




What is it?

UNIX is an operating system that is very popular in universities, companies, and big enterprises.

Linux is an example of open-source software development and a free operating system (OS).

Architectures:

UNIX is available on PA-RISC and Itanium machines.

Linux was originally developed for Intel's x86 hardware; ports are available for over two dozen CPU types, including ARM.

File system support:

UNIX: JFS, GPFS, HFS, UFS, XFS file system formats

Linux: ext2, ext3, ext4, JFS, ReiserFS, XFS, Btrfs file system formats

Usage:

The UNIX operating system is used in internet servers and workstations.

Linux can be installed on a wide variety of computer hardware, ranging from mobile phones, tablet computers and video game consoles, to mainframes and supercomputers.

GUI:

Initially UNIX was a command-based OS, but later a GUI called the Common Desktop Environment was created.

Linux typically provides two GUIs, KDE and GNOME, but the GUI is optional.

Market share for Desktop PC:

UNIX: Less than 0.5 percent of the PC market.

Linux: The market share of Linux is about 0.8 percent.

Threat detection and solution:

UNIX: Because of the proprietary nature of the original UNIX, users have to wait a while to get a proper bug-fixing patch, although such threats are not as common.

Linux: In the case of Linux, threat detection and resolution is very fast, as Linux is mainly community-driven; whenever any Linux user posts any kind of threat, several developers from different parts of the world start working on it.

Cost:

Different flavors of UNIX have different cost structures. With the hardware included, a midrange UNIX server can cost anywhere between $25,000 and $259,000, with high-end servers ranging up to $500,000.

Linux can be freely distributed and downloaded, and is also distributed through magazines, books, etc. There are priced versions of Linux as well, but they are normally cheaper than Windows.

Security:

A rough estimate of UNIX viruses is between 85 and 120 reported to date.
Linux has had about 60 to 100 viruses listed to date.


Text mode interface:

UNIX: Originally the Bourne shell; now compatible with many others, including BASH.

Linux: BASH (Bourne Again SHell) is the default shell, and multiple command interpreters are supported.

Development and Distribution:

UNIX systems are divided into various other flavors, mostly developed by AT&T as well as various commercial vendors and non-profit organizations.

Linux is developed through open-source development, i.e., the sharing and collaboration of code and features through forums etc., and it is distributed by various vendors such as Debian, Red Hat, SUSE, Ubuntu, Gentoo, etc.

User:

UNIX operating systems were developed mainly for mainframes, servers and workstations. The UNIX environment and the client-server program model were essential elements in the development of the Internet.

Linux is an operating system for everyone, from home users to developers and computer enthusiasts alike.

Kernel:

UNIX kernel is not freely available.

Linux kernel is freely available.

Patches:

UNIX patches available are highly tested.

Linux patches are not as highly tested as UNIX patches.

UNIX Operating System Names:

A few popular names:

HP-UX, IBM AIX, Sun Solaris, Mac OS X, IRIX

Linux Distribution (Operating System) Names:

A few popular names:

Red Hat Enterprise Linux, Fedora Linux, Debian Linux, SUSE Enterprise Linux, Ubuntu Linux

Summary

Overall, the general environment between UNIX and Linux is very similar. Moving as a user or administrator from Linux to UNIX, or vice versa, brings some inconsistencies, but overall is fairly seamless. Even though the file systems or kernels might differ and require specialized knowledge to optimize, the tools and APIs are consistent. In general, these differences are no more drastic than variations among different versions of UNIX. All branches of UNIX and Linux have evolved and will be slightly different, but because of the maturity of the UNIX concept, the foundation doesn't change very much.

Saturday 21 January 2012

Information Technology & Its Applications

What is Information Technology

Information Technology refers to anything related to computing technology that combines computers with high-speed communication links to spread information from one place to another. The interconnection of computers enables people to send and receive information. The communication links are also used to interact with different people around the world.

Overview of Information Technology

We are living in the information age of a global village today. That means information is the key factor in this era, and it is rightly said that "information is the most precious commodity of today's day-to-day business." Everything revolves around it, whether it is education, medicine, history, geographical phenomena, sports, research, or business. Name the system, and information is there to play a key role in its functionality and existence.

"Data processing" OR "Computing " Information can be defined as the facts & Figures about anything i.e. the know-how about any object that exists and plays its role in any system. the system is any identified & known work that Accepts data/information into itself, manipulates in the shape of certain output(s) and delivers so that is become useful & meaningful. and precisely, that is what is known as "Data Processing" or "Computing" for which we need a computer to accomplish the task.

Modern Scenario

The modern impact of Information Technology has broadened the base of computing and communication through satellites, fiber optics, mobile phones, fax machines, multimedia/hypermedia, e-commerce, m-commerce, etc., enhancing the shift from single, isolated technologies to a unified digital convergence and giving computer users a rich scenario of computer utilization across the fields of IT.

Applications of I.T.

i. Artificial Intelligence
ii. Web-based Applications
iii. E-commerce
iv. M-commerce (Mobile Commerce)
v. Computer Animation
vi. Multimedia and Hypermedia
vii. Distributed Computing

Artificial intelligence

The branch of computer science concerned with making computers exhibit human-like qualities such as learning, seeing, and hearing.

Web Based Applications

It is a type of software application that is accessed over the Web. The user connects to the Internet to use it, saving time and money and improving communication.

E-Commerce

E-commerce, or electronic commerce, a subset of e-business, is the purchasing, selling, and exchanging of goods and services over the Internet.

M-Commerce

Mobile commerce is the buying and selling of goods and services through wireless handheld devices such as cellular telephones and personal digital assistants (PDAs).

Computer Animation

It is the process of creating moving images using a computer.

Multimedia & Hypermedia

Multimedia is a collection of graphics, animations, audio, and video presented by a computer. Hypermedia is the process of creating links to files that contain photographs, audio, video, text, etc.

Distributed Computing

It refers to multiple computer systems working on a single problem. Because the computers are networked, they can communicate with each other to solve the problem.

Summary

There are numerous fields of computer applications, and information technology has brought about a revolution in our lifestyle. We may call it the computer revolution, the multimedia revolution, or whatever we like. It is beyond doubt that today we are living in a society that makes use of the "Information Highway" and is heading towards a real "Global Village" in human history.