CS Undergraduate Curriculum
|Educational Plan (pdf)||Bachelor Curriculum (pdf)||Educational Program (pdf)|
CS 100: Introduction to Computer Science and Engineering
The first part of the course introduces students to the profession, including the disciplines of chemical, civil, computer, electrical, environmental, and mechanical engineering. It prepares students for success through the integration of the following important skills: technical problem solving and engineering design, ethical decision-making, teamwork, and communicating to diverse audiences. The second part is aimed at students with little or no programming experience. It aims to provide students with an understanding of the role computation can play in solving problems, and to help students, regardless of their major, feel justifiably confident of their ability to write small programs that accomplish useful goals. The class will use the Python programming language.
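As a taste of the kind of small, useful program the course has in mind, here is a minimal sketch in Python (the numbers and the task are illustrative, not course material):

```python
def years_to_double(rate):
    """Return the number of whole years needed for a balance to double
    when it grows by `rate` (e.g. 0.07 for 7%) per year."""
    balance = 1.0
    years = 0
    while balance < 2.0:
        balance *= 1.0 + rate
        years += 1
    return years

print(years_to_double(0.07))  # at 7% per year, doubling takes 11 years
```

Programs like this — a loop, a condition, a function, a printed answer — are exactly the "small programs that accomplish useful goals" the course description refers to.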
Probability and Statistics
Probability and statistics are two related but separate academic disciplines. Statistical analysis often uses probability distributions, and the two topics are often studied together. However, probability theory contains much that is mostly of mathematical interest and not directly relevant to statistics. Moreover, many topics in statistics are independent of probability theory.
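The relationship between the two disciplines can be sketched in a few lines of Python: probability supplies a theoretical distribution, while statistics estimates the same distribution from observed data (the die and the number of rolls here are illustrative):

```python
import random
from collections import Counter

random.seed(0)  # fixed seed so the simulation is reproducible

# Probability: the theoretical chance of each face of a fair die.
theoretical = {face: 1 / 6 for face in range(1, 7)}

# Statistics: estimate the same distribution from simulated rolls.
rolls = [random.randint(1, 6) for _ in range(60_000)]
counts = Counter(rolls)
empirical = {face: counts[face] / len(rolls) for face in range(1, 7)}

for face in range(1, 7):
    print(face, round(theoretical[face], 3), round(empirical[face], 3))
```

With 60,000 rolls, each empirical frequency lands close to the theoretical 1/6, which is the connection the paragraph above describes.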
CS 107: Physics - 1 (Mechanics)
Introductory physics course, geared towards engineering majors. Equilibrium and motion of particles in one and two dimensions in the framework of Newtonian mechanics, force laws (including gravity), energy, momentum, rotational motion, conservation laws, and fluids. Examples will be drawn from astronomy, biology, sports, and current events. Prerequisites: Mathematics
CS 108: Physics - 2 (Electricity and Magnetism)
A physics course geared toward engineering majors. Electricity and Magnetism provides instruction in each of the following five content areas: electrostatics; conductors, capacitors, and dielectrics; electric circuits; electric and magnetic fields; and electromagnetism, including DC and AC circuitry.
Linear Algebra
Linear algebra is the branch of mathematics concerning finite- or countably infinite-dimensional vector spaces, as well as linear mappings between such spaces. Such an investigation is initially motivated by a system of linear equations in several unknowns. Such equations are naturally represented using the formalism of matrices and vectors. Linear algebra is central to both pure and applied mathematics. For instance, abstract algebra arises by relaxing the axioms of a vector space, leading to a number of generalizations.
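A system of linear equations in matrix-and-vector form can be solved in a few lines. This sketch uses Cramer's rule on a small made-up 2×2 system (exact rational arithmetic via the standard library, no external packages):

```python
from fractions import Fraction

# The system  2x + 1y = 5,  1x + 3y = 10  as a matrix A and vector b.
A = [[Fraction(2), Fraction(1)],
     [Fraction(1), Fraction(3)]]
b = [Fraction(5), Fraction(10)]

def det2(m):
    """Determinant of a 2x2 matrix."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

# Cramer's rule: replace a column of A with b, divide determinants.
d = det2(A)
x = det2([[b[0], A[0][1]], [b[1], A[1][1]]]) / d
y = det2([[A[0][0], b[0]], [A[1][0], b[1]]]) / d

print(x, y)  # → 1 3
```

Cramer's rule is chosen here only because it fits in a few lines; Gaussian elimination is the workhorse method for larger systems.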
CS 105: Calculus - 1
This is a first course in the calculus of one variable intended for computer science, industrial engineering, industrial electronics and applied mathematics students. It is open to others who are qualified and desire a more rigorous mathematics course at the core level. Topics include a brief review of polynomials, trigonometric, exponential, and logarithmic functions, followed by discussion of limits, derivatives, and applications of differential calculus to real-world problem areas. An introduction to integration concludes the course.
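The limit definition of the derivative translates directly into a short numerical sketch: replace the limit with a small finite step h (the function and step size below are illustrative):

```python
def derivative(f, x, h=1e-6):
    """Central-difference approximation of f'(x): the limit definition
    of the derivative evaluated with a small finite h."""
    return (f(x + h) - f(x - h)) / (2 * h)

# f(x) = x**3 has derivative 3*x**2, so f'(2) should be close to 12.
print(derivative(lambda x: x ** 3, 2.0))
```

As h shrinks, the quotient approaches the true derivative, which is the limit process the course develops rigorously.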
CS 106: Calculus - 2
This is a second course in the calculus of one variable intended for computer science, industrial engineering, industrial electronics and applied mathematics students. It is open to others who are qualified and desire a more rigorous mathematics course at the core level. Topics include an overview of integration, basic techniques for integration, a variety of applications of integration, and an introduction to (systems of) differential equations.
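One of the basic techniques the course covers, the trapezoid rule, can be sketched numerically in a few lines (the integrand and the number of subintervals are illustrative):

```python
def trapezoid(f, a, b, n=1000):
    """Approximate the integral of f over [a, b] using n trapezoids."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))   # endpoints get half weight
    for i in range(1, n):
        total += f(a + i * h)      # interior points get full weight
    return total * h

# The integral of x**2 from 0 to 1 is exactly 1/3.
print(trapezoid(lambda x: x ** 2, 0.0, 1.0))
```

The approximation error shrinks quadratically as n grows, which motivates the error analysis done analytically in the course.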
Algorithm and programming
In mathematics and computer science, an algorithm is a step-by-step procedure for calculations. Algorithms are used for calculation, data processing, and automated reasoning. More precisely, an algorithm is an effective method expressed as a finite list of well-defined instructions for calculating a function. Starting from an initial state and initial input (perhaps empty), the instructions describe a computation that, when executed, will proceed through a finite number of well-defined successive states, eventually producing "output" and terminating at a final ending state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input.
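Euclid's algorithm for the greatest common divisor is a classic example of such a finite list of well-defined instructions: each step transforms the state (a, b), and the procedure is guaranteed to terminate.

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)
    until b == 0; the remaining a is the greatest common divisor."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(252, 105))  # → 21
```

Each iteration strictly shrinks b, so the computation passes through a finite number of states and halts — exactly the behavior the definition above requires.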
Mathematical logic and algorithm theory
Mathematical logic and algorithm theory (also known as symbolic logic) is a subfield of mathematics with close connections to the foundations of mathematics, theoretical computer science, and philosophical logic. The field includes both the mathematical study of logic and the applications of formal logic to other areas of mathematics. The unifying themes in mathematical logic include the study of the expressive power of formal systems and the deductive power of formal proof systems. Mathematical logic is often divided into the fields of set theory, model theory, recursion theory, and proof theory. These areas share basic results on logic, particularly first-order logic, and definability. In computer science (particularly in the ACM Classification), mathematical logic encompasses additional topics; see logic in computer science.
Discrete Mathematics
Discrete mathematics is mathematics that deals with discrete objects. Discrete objects are those which are separated from (not connected to, and distinct from) each other. Integers (whole numbers), rational numbers (those that can be expressed as the quotient of two integers), automobiles, houses, people, etc. are all discrete objects. On the other hand, the real numbers, which include rational as well as irrational numbers, are not discrete: between any two different real numbers there is another real number different from either of them. In this course we will be concerned with objects such as integers, propositions, sets, relations, and functions, which are all discrete. We will study the concepts associated with them, their properties, and the relationships among them.
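The course's core objects — sets, relations, and functions over a finite universe — all have direct counterparts in Python (the particular sets below are made up for illustration):

```python
# Two small finite sets.
A = {1, 2, 3}
B = {2, 3, 4}

union = A | B          # elements in A or B
intersection = A & B   # elements in both A and B

# A relation on A: the set of "strictly less than" ordered pairs.
less_than = {(x, y) for x in A for y in A if x < y}

# A function on A, represented extensionally as (input, output) pairs.
square = {(x, x * x) for x in A}

print(sorted(union), sorted(intersection), sorted(less_than))
```

Representing a relation or function as a set of ordered pairs mirrors the formal definitions used in the course.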
Data Structures
In computer science, a data structure is a particular way of storing and organizing data in a computer so that it can be used efficiently. Different kinds of data structures are suited to different kinds of applications, and some are highly specialized to specific tasks. For example, B-trees are particularly well-suited for implementation of databases, while compiler implementations usually use hash tables to look up identifiers. Data structures provide a means to manage huge amounts of data efficiently, such as large databases and internet indexing services. Usually, efficient data structures are a key to designing efficient algorithms. Some formal design methods and programming languages emphasize data structures, rather than algorithms, as the key organizing factor in software design.
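The compiler example above can be sketched directly: a hash table (a Python dict) gives average constant-time lookup of identifiers. The symbol-table layout here is a simplified, hypothetical one:

```python
symbol_table = {}  # a hash table mapping identifier -> attributes

def declare(name, info):
    """Record an identifier in the table."""
    symbol_table[name] = info

def lookup(name):
    """Return an identifier's attributes, or None if undeclared."""
    return symbol_table.get(name)

declare("count", {"type": "int", "offset": 0})
declare("total", {"type": "float", "offset": 4})

print(lookup("count"))
print(lookup("missing"))  # an undeclared identifier
```

Swapping the dict for a list would make every lookup a linear scan — a small illustration of why the choice of data structure drives the efficiency of the algorithm built on top of it.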
Programming Languages
A programming language is an artificial language designed to communicate instructions to a machine, particularly a computer. Programming languages can be used to create programs that control the behavior of a machine and/or to express algorithms precisely. The earliest programming languages predate the invention of the computer, and were used to direct the behavior of machines such as Jacquard looms and player pianos. Thousands of different programming languages have been created, mainly in the computer field, with many more being created every year. Most programming languages describe computation in an imperative style, i.e., as a sequence of commands, although some languages, such as those that support functional programming or logic programming, use alternative forms of description.
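The imperative/functional distinction drawn above can be shown side by side in Python, which supports both styles (the squaring task is a made-up example):

```python
nums = [1, 2, 3, 4]

# Imperative style: a sequence of commands that mutate state.
squares_imperative = []
for n in nums:
    squares_imperative.append(n * n)

# Functional style: an expression describing the result, no mutation.
squares_functional = list(map(lambda n: n * n, nums))

print(squares_imperative == squares_functional)  # → True
```

Both fragments compute the same values; they differ in whether the program is read as a recipe of state changes or as a description of the result.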
Electrotechnics and Electronics
The aim of the course is to teach students to evaluate the condition of electrical equipment and its adaptation potential, to reveal the factors that improve its operation, and to choose equipment professionally in order to increase the productivity of foodstuffs made from plants. The course consists of three interrelated modules representing the structural-functional connections among electric circuit theory, electromagnetic devices, the fundamentals of electronics, and electrical measurement. The successive and systematic study of the course will provide students with knowledge of the fundamentals of electrotechnics and their connection with the principles on which electrical and electronic equipment is built. The course should also teach students the essence of electromagnetic processes in an electric circuit, their dependence on energy sources and receivers, and their significance for the efficient use of electrical equipment. This course is basic for all courses using knowledge of electrical equipment.
Engineering Computer Graphics
Ideally suited for designers, broadcasters, architects, and engineers, as well as anyone who would like to learn to create 2-dimensional and 3-dimensional study models with ease and sophistication, this course delivers a well-rounded introduction to the power of Google's SketchUp. Students are enabled to draw using a familiar pencil-and-paper paradigm in a software context. Google SketchUp makes 3D modeling easy enough for anyone to learn, and fast enough to use under real-world time constraints. SketchUp will allow students to demonstrate to clients what a new building will look like, recreate and fly through the scene of an accident, or visualize a theatrical set before it's built. At the conclusion of this course, students will be comfortable creating, animating, and displaying 3D environments at a sophisticated level.
Differential Equations
A differential equation is a mathematical equation for an unknown function of one or several variables that relates the values of the function itself and its derivatives of various orders. Differential equations play a prominent role in engineering, physics, economics, and other disciplines. Differential equations arise in many areas of science and technology, specifically whenever a deterministic relation involving some continuously varying quantities (modeled by functions) and their rates of change in space and/or time (expressed as derivatives) is known or postulated. This is illustrated in classical mechanics, where the motion of a body is described by its position and velocity as the time value varies. Newton's laws allow one (given the position, velocity, acceleration and various forces acting on the body) to express these variables dynamically as a differential equation for the unknown position of the body as a function of time. In some cases, this differential equation (called an equation of motion) may be solved explicitly.
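When an equation of motion cannot be solved explicitly, it can still be stepped forward numerically. This is a minimal Euler-method sketch for a body falling from rest (dv/dt = -g, dx/dt = v); g, the step size, and the initial height are illustrative values:

```python
g = 9.8            # gravitational acceleration, m/s^2
dt = 0.001         # time step, seconds
steps = 1000       # integrate over 1 second total
x, v = 100.0, 0.0  # start 100 m up, at rest

for _ in range(steps):
    x += v * dt    # position changes at the current velocity
    v += -g * dt   # velocity changes at the acceleration

# The exact solution x(1) = 100 - 0.5*g*1**2 = 95.1 m checks the sketch.
print(x)
```

Here the exact solution is available, so it confirms the numerical result; for equations without closed-form solutions, the same stepping scheme is often the only practical option.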
Theory of Computation
In theoretical computer science and mathematics, the theory of computation is the branch that deals with whether and how efficiently problems can be solved on a model of computation, using an algorithm. The field is divided into three major branches: automata theory, computability theory, and computational complexity theory. In order to perform a rigorous study of computation, computer scientists work with a mathematical abstraction of computers called a model of computation. There are several models in use, but the most commonly examined is the Turing machine. Computer scientists study the Turing machine because it is simple to formulate, can be analyzed and used to prove results, and because it represents what many consider the most powerful possible "reasonable" model of computation (see Church–Turing thesis). It might seem that the potentially infinite memory capacity is an unrealizable attribute, but any decidable problem solved by a Turing machine will always require only a finite amount of memory. So in principle, any problem that can be solved (decided) by a Turing machine can be solved by a computer that has a bounded amount of memory.
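A Turing machine is simple enough to simulate in a few lines. This sketch uses a one-sided tape, a made-up rule format of (write, move, next-state) triples, and a single-state machine that walks right and flips every bit until it reads the blank symbol:

```python
def run_tm(tape, rules, state="flip", halt="halt"):
    """Run a right-growing single-tape Turing machine until it halts."""
    tape = list(tape)
    head = 0
    while state != halt:
        sym = tape[head] if head < len(tape) else "_"   # '_' is blank
        write, move, state = rules[(state, sym)]
        if head < len(tape):
            tape[head] = write
        else:
            tape.append(write)
        head += 1 if move == "R" else -1
    return "".join(tape).rstrip("_")

# Transition table: flip each bit, halt on blank.
rules = {
    ("flip", "0"): ("1", "R", "flip"),
    ("flip", "1"): ("0", "R", "flip"),
    ("flip", "_"): ("_", "R", "halt"),
}

print(run_tm("0110", rules))  # → 1001
```

Even this toy machine shows the ingredients of the formal model: a finite control (the rules), a tape, and a head that reads, writes, and moves one cell at a time.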
Computer Architecture
In computer science and engineering, computer architecture refers to the specification of the relationship between the different hardware components of a computer system. It may also refer to the practical art of defining the structure and relationship of the subcomponents of a computer. As in the architecture of buildings, computer architecture can comprise many levels of information. The highest level of the definition conveys the concepts implemented by the system. Whereas in building architecture this overview is normally visual, computer architecture is primarily logical, positing a conceptual system that serves a particular purpose. In both instances (building and computer), many levels of detail are required to completely specify a given implementation, and some of these details are often implied as common practice. For example, at a high level, computer architecture is concerned with how the central processing unit (CPU) acts and how it accesses computer memory. Some currently fashionable computer architectures include cluster computing and Non-Uniform Memory Access.
Software Engineering
Software engineering (SE) is the application of a systematic, disciplined, quantifiable approach to the design, development, operation, and maintenance of software, and the study of these approaches; that is, the application of engineering to software. The term software engineering first appeared in the 1968 NATO Software Engineering Conference, and was meant to provoke thought regarding the perceived "software crisis" at the time. Software development, a much-used and more generic term, does not necessarily subsume the engineering paradigm. The field's future looks bright according to Money Magazine and Salary.com, which rated Software Engineer as the best job in the United States in 2006. In 2012, software engineering was again ranked as the best job in the United States, this time by CareerCast.com.
Microprocessors
A microprocessor incorporates the functions of a computer's central processing unit (CPU) on a single integrated circuit (IC), or at most a few integrated circuits. It is a multipurpose, programmable device that accepts digital data as input, processes it according to instructions stored in its memory, and provides results as output. It is an example of sequential digital logic, as it has internal memory. Microprocessors operate on numbers and symbols represented in the binary numeral system. The advent of low-cost computers on integrated circuits has transformed modern society. General-purpose microprocessors in personal computers are used for computation, text editing, multimedia display, and communication over the Internet. Many more microprocessors are part of embedded systems, providing digital control of a myriad of objects from appliances to automobiles to cellular phones and industrial process control.
HTML/XHTML
Over the years, Web technologies have continued to develop so that now virtually any type of audio-visual media can be 'carried' via the Internet. Most of these technologies, however, have effectively been 'bolted on' to the original web page construction language: Hypertext Markup Language (HTML). You'll notice most web addresses begin with 'http://', which means 'Hypertext Transfer Protocol', and end with .HTM or .HTML. (In case you're wondering, the 'HTM' extension is for backward compatibility with older DOS-based systems, which limited file extensions to three letters.)
Computer Organization
Computer Organization refers to the level of abstraction above the digital logic level, but below the operating system level. At this level, the major components are functional units or subsystems that correspond to specific pieces of hardware built from the lower-level building blocks described in the previous module. A closely related term, computer architecture, emphasizes the engineering decisions and tradeoffs that must be made in order to produce a "good" design. The computer architect answers questions such as: How many registers should there be? What machine instructions should there be? How should the cache be organized? What hardware support should there be for virtual memory?
Operating System Concepts
An operating system (OS) is a collection of software that manages computer hardware resources and provides common services for computer programs. The operating system is a vital component of the system software in a computer system. Application programs require an operating system to function. Time-sharing operating systems schedule tasks for efficient use of the system and may also include accounting for cost allocation of processor time, mass storage, printing, and other resources. For hardware functions such as input and output and memory allocation, the operating system acts as an intermediary between programs and the computer hardware, although the application code is usually executed directly by the hardware and will frequently make a system call to an OS function or be interrupted by it. Operating systems can be found on almost any device that contains a computer — from cellular phones and video game consoles to supercomputers and web servers.
Database Management System (DBMS)
A database is an organized collection of data, today typically in digital form. The data are typically organized to model relevant aspects of reality (for example, the availability of rooms in hotels), in a way that supports processes requiring this information (for example, finding a hotel with vacancies). The term database is correctly applied to the data and their supporting data structures, and not to the database management system (DBMS). A database together with its DBMS is called a database system. The term database system implies that the data is managed to some level of quality (measured in terms of accuracy, availability, usability, and resilience), and this in turn often implies the use of a general-purpose database management system (DBMS). A general-purpose DBMS is typically a complex software system that meets many usage requirements to properly maintain its databases, which are often large and complex. The utilization of databases is now so widespread that virtually every technology and product relies on databases and DBMSs for its development and commercialization, or may even have DBMS software embedded in it. Organizations and companies, from small to large, depend heavily on databases for their operations.
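The hotel-vacancy example above can be made concrete with SQLite, a general-purpose DBMS shipped in Python's standard library (the table layout and hotel names are hypothetical):

```python
import sqlite3

# An in-memory database modeling room availability.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rooms (hotel TEXT, number INTEGER, vacant INTEGER)")
conn.executemany(
    "INSERT INTO rooms VALUES (?, ?, ?)",
    [("Grand", 101, 1), ("Grand", 102, 0), ("Plaza", 201, 1)],
)

# A query supporting the process "find a hotel with vacancies".
vacant_hotels = [row[0] for row in conn.execute(
    "SELECT DISTINCT hotel FROM rooms WHERE vacant = 1 ORDER BY hotel")]
print(vacant_hotels)  # → ['Grand', 'Plaza']
```

The data (rows in `rooms`) and the DBMS (SQLite) are distinct things, and together they form a database system — the distinction the paragraph above draws.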
Computer Networking and Telecommunications
The telecommunications industry includes wireless technology, computer networks, telephone systems, television, and radio. Research facilities, manufacturers, and network providers fuel this fast-paced field. The application of electrical engineering and physics has revolutionized business and personal communications by increasing transmission rates, reliability, and range. Elements of human communication including data, voice, images and movies are now transmitted in seconds around the globe. In addition, advances in technology have significantly decreased the size of communication devices.
Robotics
Robotics is the branch of technology that deals with the design, construction, operation, and application of robots and the computer systems for their control, sensory feedback, and information processing. These technologies deal with automated machines that can take the place of humans in hazardous environments or manufacturing processes, or simply resemble humans. Many of today's robots are inspired by nature, contributing to the field of bio-inspired robotics. The concept and creation of machines that could operate autonomously dates back to classical times, but research into the functionality and potential uses of robots did not grow substantially until the 20th century. Throughout history, robotics has often been seen to mimic human behavior, and often to manage tasks in a similar fashion. Today, robotics is a rapidly growing field, as we continue to research, design, and build new robots that serve various practical purposes, whether domestically, commercially, or militarily. Many robots do jobs that are hazardous to people, such as defusing bombs, exploring shipwrecks and mines.
Object Oriented Programming
Object-oriented programming (OOP) is a programming paradigm using "objects" – i.e., instances of a class consisting of data fields and methods together with their interactions – to design applications and computer programs. Programming techniques may include features such as data abstraction, encapsulation, messaging, modularity, polymorphism, and inheritance. Many modern programming languages now support OOP, at least as an option.
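Several of the listed features — classes with data fields and methods, inheritance, and polymorphism — fit in one small Python sketch (the shape hierarchy is a standard illustrative example, not course material):

```python
class Shape:
    """Abstract base: every shape promises an area() method."""
    def area(self):
        raise NotImplementedError

class Rectangle(Shape):
    def __init__(self, w, h):
        self.w, self.h = w, h       # data fields
    def area(self):                 # method
        return self.w * self.h

class Square(Rectangle):            # inheritance: a Square is a Rectangle
    def __init__(self, side):
        super().__init__(side, side)

# Polymorphism: the same area() call dispatches to each object's class.
shapes = [Rectangle(2, 3), Square(4)]
print([s.area() for s in shapes])  # → [6, 16]
```

The caller never inspects which concrete class each object has; the method lookup — the "interaction" between objects and messages — does that work.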
Human–Computer Interaction (HCI)
Human–computer interaction (HCI) involves the study, planning, and design of the interaction between people (users) and computers. It is often regarded as the intersection of computer science, behavioral sciences, design, and several other fields of study. The term was popularized by Card, Moran, and Newell in their seminal 1983 book, The Psychology of Human-Computer Interaction, although the authors first used the term in 1980, and the first known use was in 1975. The term connotes that, unlike other tools with only limited uses (such as a hammer, useful for driving nails, but not much else), a computer has many affordances for use, and this takes place in an open-ended dialog between the user and the computer.
Artificial Intelligence (AI)
Artificial intelligence (AI) is the intelligence of machines and the branch of computer science that aims to create it. AI textbooks define the field as "the study and design of intelligent agents", where an intelligent agent is a system that perceives its environment and takes actions that maximize its chances of success. John McCarthy, who coined the term in 1955, defines it as "the science and engineering of making intelligent machines." AI research is highly technical and specialized, deeply divided into subfields that often fail to communicate with each other. Some of the division is due to social and cultural factors: subfields have grown up around particular institutions and the work of individual researchers. AI research is also divided by several technical issues. There are subfields which are focused on the solution of specific problems, on one of several possible approaches, on the use of widely differing tools, and towards the accomplishment of particular applications. The central problems of AI include such traits as reasoning, knowledge, planning, learning, communication, perception, and the ability to move and manipulate objects. General intelligence (or strong AI) is still among the field's long-term goals.
Network Technology
Network technology refers to the hardware and software used to connect a group of two or more computers. In scope, it can encompass setting up peer-to-peer connections through local area networks (LANs), and even includes an understanding of how the Internet and the World Wide Web function.
Information Security
Information security means protecting information and information systems from unauthorized access, use, disclosure, disruption, modification, perusal, inspection, recording, or destruction. The terms information security, computer security, and information assurance are frequently used interchangeably. These fields are often interrelated and share the common goals of protecting the confidentiality, integrity, and availability of information; however, there are some subtle differences between them. These differences lie primarily in the approach to the subject, the methodologies used, and the areas of concentration. Information security is concerned with the confidentiality, integrity, and availability of data regardless of the form the data may take: electronic, print, or other forms. Computer security can focus on ensuring the availability and correct operation of a computer system without concern for the information stored or processed by the computer. Information assurance focuses on the reasons for assurance that information is protected, and is thus reasoning about information security.
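One leg of the confidentiality/integrity/availability triad — integrity — can be demonstrated in a few lines: a cryptographic digest detects whether data was modified (the messages below are made up for illustration):

```python
import hashlib

# The sender computes a digest of the original message.
original = b"transfer $100 to account 42"
digest = hashlib.sha256(original).hexdigest()

# The receiver recomputes the digest; a mismatch reveals tampering.
received = b"transfer $900 to account 42"  # a tampered copy
tampered = hashlib.sha256(received).hexdigest() != digest
print(tampered)  # → True
```

A digest alone guarantees only integrity, not confidentiality (the message is still readable) or authenticity (an attacker could replace both message and digest) — distinctions the course develops with encryption and message authentication codes.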
Life Safety
The publication Life Safety Code, known as NFPA 101, is a consensus standard widely adopted in the United States. It is administered, trademarked, copyrighted, and published by the National Fire Protection Association and, like many NFPA documents, is systematically revised on a three-year cycle. Despite its title, the standard is not a legal code, is not published as an instrument of law, and has no statutory authority in its own right. However, it is deliberately crafted with language suitable for mandatory application to facilitate adoption into law by those empowered to do so.
Metrology and Standardization
GOST (Russian: ГОСТ) refers to a set of technical standards maintained by the Euro-Asian Council for Standardization, Metrology and Certification (EASC), a regional standards organization operating under the auspices of the Commonwealth of Independent States (CIS). All sorts of regulated standards are included, with examples ranging from charting rules for design documentation to recipes and nutritional facts of Soviet-era brand names (which have now become generic, but may only be sold under the label if the technical standard is followed, or renamed if they are reformulated). GOST standards were originally developed by the government of the Soviet Union as part of its national standardization strategy. The word GOST (Russian: ГОСТ) is an acronym for gosudarstvennyy standart, which means state standard. The history of national standards in the USSR can be traced back to 1925, when a government agency, later named Gosstandart, was established and put in charge of writing, updating, publishing, and disseminating the standards. After World War II, the national standardization program went through a major transformation. The first GOST standard, GOST 1 "State Standardization System", was published in 1968. The following countries have adopted GOST standards in addition to their own, nationally developed standards: Russia, Belarus, Ukraine, Moldova, Kazakhstan, Azerbaijan, Armenia, Kyrgyzstan, Uzbekistan, Tajikistan, Georgia, and Turkmenistan.