Companies found another disadvantage of database processing: vulnerability. If a file-processing system fails, only that particular application will be out of commission. But if the database fails, all of its dependent applications will be out of commission. Gradually, the situation improved. Hardware and software engineers learned how to build systems powerful enough to support many concurrent users and fast enough to keep up with the daily workload of transactions. New ways of controlling, protecting, and backing up the database were devised. Standard procedures for database processing evolved, and programmers learned how to write more efficient and more maintainable code. By the mid-1970s, databases could efficiently and reliably process organizational applications. Many of those applications are still running today, more than 25 years after their creation!
Relational DBMS products were impractical until the 1980s, when faster computer hardware was developed and the price-performance ratio of computers fell dramatically. The relational model also seemed foreign to many programmers, who were accustomed to writing programs in which they processed data one record at a time. But relational DBMS products process data most naturally an entire table at a time. Accordingly, programmers had to learn a new way to think about data processing. Because of these problems, even though the relational model had many advantages, it did not gain true popularity until computers became more powerful. In particular, as microcomputers entered the scene, more and more CPU cycles could be devoted to a single user. Such power was a boon to relational DBMS products and set the stage for the next major database development.
One impact of the move of database technology to the microcomputer was the dramatic improvement in DBMS user interfaces. Users of microcomputer systems are generally not MIS professionals, and they will not put up with the clumsy and awkward user interfaces common on mainframe DBMS products. Thus, as DBMS products were devised for micros, user interfaces had to be simplified and made easier to use. This was possible because micro DBMS products operate on dedicated computers and because more computer power was available to process the user interface. Today, DBMS products are rich and robust, with graphical user interfaces such as Microsoft Windows. The combination of microcomputers, the relational model, and vastly improved user interfaces enabled database technology to move from an organizational context to a personal-computing context. When this occurred, the number of sites that used database technology exploded. In 1980 there were about 10,000 sites using DBMS products in the United States. Today there are well over 20 million such sites!

In the late 1980s, a new style of programming called object-oriented programming (OOP) began to be used, with a substantially different orientation from that of traditional programming. In brief, the data structures processed with OOP are considerably more complex than those processed with traditional languages, and these data structures are difficult to store in existing relational DBMS products. As a consequence, a new category of DBMS products called object-oriented database systems (ODBMS) is evolving to store and process OOP data structures. For a variety of reasons, OOP has not yet been widely used for business information systems. First, it is difficult to use, and it is very expensive to develop OOP applications. Second, most organizations have millions or billions of bytes of data already organized in relational databases, and they are unwilling to bear the cost and risk required to convert those databases to an ODBMS format.
Finally, most ODBMS have been developed to support engineering applications, and they do not have features and functions that are appropriate or readily adaptable to business information applications. Consequently, for the foreseeable future, ODBMS are likely to occupy a niche in commercial information systems applications.
Because a LAN coordinates the actions of independent CPUs, it led to a new style of multi-user database processing called the client-server database architecture. Not all database processing on a LAN is client-server processing, however. A simpler, but less robust, mode of processing is called the file-sharing architecture. A company like Treble Clef could most likely use either type, since it is a small organization with modest processing requirements. Larger workgroups, however, would require client-server processing.
Opening a database file
This topic discusses using the Open Query File (OPNQRYF) command and the Open Database File (OPNDBF) command to open database file members in a program. Examples, performance considerations, and guidelines to follow when writing a high-level language program are included, and typical errors that can occur are also discussed.

Basic database file operations in programs
This topic discusses basic database operations: setting a position in the database file, and reading, updating, adding, and deleting records in a database file. A description of several ways to read database records is also included. The update discussion covers how to change an existing database record in a logical or physical file; the add discussion covers writing a new record to a physical database member.

Closing a database file
This topic includes ways you can close a database file when your program completes processing a database file member, disconnecting your program from the file.

Monitoring database file errors in a program
This topic includes the messages to monitor when handling database file errors in a program.
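The open/read/update/add/delete/close sequence described above can be sketched in a generic form. OPNQRYF and OPNDBF are platform-specific CL commands, so this is not their actual syntax; it is a minimal illustration of the same operation sequence using Python's built-in sqlite3 module, with an invented `member` table standing in for a database file member.

```python
import sqlite3

# Hypothetical illustration: mirrors the open -> add -> read -> update ->
# delete -> close sequence of operations on a database file member.
conn = sqlite3.connect(":memory:")   # "open" the database
cur = conn.cursor()
cur.execute("CREATE TABLE member (id INTEGER PRIMARY KEY, name TEXT)")

# Add (write) new records
cur.executemany("INSERT INTO member (id, name) VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob")])

# Read records; ORDER BY plays the role of setting a position in the file
rows = cur.execute("SELECT id, name FROM member ORDER BY id").fetchall()

# Update an existing record
cur.execute("UPDATE member SET name = ? WHERE id = ?", ("Beth", 2))

# Delete a record
cur.execute("DELETE FROM member WHERE id = ?", (1,))

conn.commit()
conn.close()                         # "close" disconnects the program
```

Monitoring errors corresponds here to wrapping these calls in `try`/`except sqlite3.Error`, the analogue of monitoring for error messages in a high-level language program.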
In mature organizations, firms rely on a fairly well-developed enterprise data model. Large software packages for enterprise-wide management, such as Oracle's application software and the SAP software from SAP, Inc., are examples of packages built for enterprise development.
1. Data - represented as some type of entity-relationship or object-relationship diagram.
2. Processes - these manipulate data and are represented by data flow diagrams.
3. Network - this transports the data and is often shown as a schematic of network links.
4. People - who perform processes; key sources and receivers of data and information.
5. Events and Points-In-Time - when processes are performed; often shown as state-transition diagrams.
6. Reasons (also called Rules) - for events that govern the processing of data; listed in policy and procedure manuals.
System Planning.
Your book lists several top-down approaches for the development of systems based upon business strategies, etc. In reality, there is often not a high level of planning for systems. Systems are developed based upon both the strategic and tactical needs of the firm and the firm's departments. Most systems development efforts follow a classical Systems Development Life Cycle:
1. Project Identification - preliminary analysis of needs.
2. Project Initiation and Planning - identifies possible approaches to solving the business need.
3. Analysis - identify the functional specifications for the business needs; continue to confirm that the project is feasible.
4. Logical Design - detailed functional specifications in the form of screen designs, report layout designs, data models, processing rules, etc.
5. Physical Design - programming, development of physical data structures, and redesigning parts of the organization to support the new system.
6. Implementation - sometimes includes testing (or testing may be part of physical design); includes a plan to cut over to the new system.
7. Maintenance - continued modification of the system as the firm's needs change.
Computer-aided Software Engineering (CASE) tools automate the development of physical components of a system and support the development effort by providing modeling tools.
o Example: Oracle Designer to model processes and data.
o Example: Oracle Developer to generate computer programs, reports, and screens.
CASE tool products (diagrams, forms, reports, etc.) can usually be shared by many members of the project development team. A Repository stores the components that are developed by the team to enable the sharing.
Three-Schema Architecture.
This is the concept that there are different levels or views of the data that the firm uses.
Level 0--Reality. This is the actual real data that managers use.
Level 1--Conceptual Schema - also called a Conceptual Model.
o This model is developed during the analysis stage of system development.
o Typically this is some type of diagramming model of the data and the relationships among the data. It is an abstraction of reality that focuses on the data the firm uses for specific applications.
o Entity-Relationship Diagrams capture the overall structure of the organization's data (Chapters 3 and 4).
Level 2--External Schema - also termed a Logical Model.
o This model is developed during the design stage and represents the actual implementation of the conceptual model with a specific DBMS product.
o This includes table design and the application of Normalization rules to translate the conceptual model into relations (Chapter 6).
Level 3--Physical or Internal Schema.
o This is created during the physical design and implementation phase.
o It includes the actual storage of data and the creation of the files that comprise the database, including indexes and other structures (Chapter 7).
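The distinction between Levels 2 and 3 can be seen in a single short session: a table definition expresses the logical model, while an index is a purely physical/internal structure that changes how data is stored and accessed without changing what the data means. This is a sketch using Python's sqlite3; the `employee` table and index names are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Level 2 (external/logical schema): a relation produced by translating
# the conceptual model into a table design
cur.execute("CREATE TABLE employee (emp_id INTEGER PRIMARY KEY, dept TEXT)")

# Level 3 (physical/internal schema): an index, a storage structure that
# speeds access but does not alter the logical view of the data
cur.execute("CREATE INDEX idx_employee_dept ON employee (dept)")

# Inspect the physical structures the DBMS now maintains for this table
indexes = [row[1] for row in cur.execute("PRAGMA index_list('employee')")]
conn.close()
```

Queries against `employee` are written identically with or without the index, which is exactly the point of separating the logical and physical schemas.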
Terminology.
Entities - person, place, object, event, or concept in the user environment. In the Henry's Bookstore example, entities include Publishers, Branches, Authors, and Books.
Entity Type - often shortened to entity, e.g., Publisher.
Entity Instance - a single occurrence of an entity/entity type, e.g., Signet Publishing Company.
Attributes - properties or characteristics of an entity. In the Henry's Bookstore example, attributes of interest for the entity Book are Book Code, Title, Publisher Code, Type, Price, and Paper Back.
Relationships - associations between entities. In the Henry's Bookstore example, Publisher and Book are related: a Publisher is related to all the Books it publishes, and a Book is related to its one Publisher.
Entities - each entity is a table.
Attributes - each attribute is a column in the table for that entity.
Relationships - tables share common columns that are used to link tables to one another.
Row of a table = an instance, occurrence, record.
Column of a table = an attribute, characteristic, field.
Is there a relationship between two entities/tables? Yes, if they share a common column or set of common columns.
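The entity-to-table mapping above can be made concrete with the Henry's Bookstore entities from these notes. This is a minimal sketch using Python's sqlite3; the exact column names and sample data are assumptions chosen for illustration, not taken from the textbook.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Each entity becomes a table; each attribute becomes a column.
cur.execute("""CREATE TABLE publisher (
                   publisher_code TEXT PRIMARY KEY,
                   name           TEXT)""")
cur.execute("""CREATE TABLE book (
                   book_code      TEXT PRIMARY KEY,
                   title          TEXT,
                   publisher_code TEXT REFERENCES publisher(publisher_code),
                   price          REAL)""")

# Each row is an entity instance.
cur.execute("INSERT INTO publisher VALUES ('SI', 'Signet Publishing Company')")
cur.execute("INSERT INTO book VALUES ('B1', 'Sample Title', 'SI', 9.95)")

# The relationship exists because the tables share the publisher_code column.
row = cur.execute("""SELECT b.title, p.name
                     FROM book b
                     JOIN publisher p ON b.publisher_code = p.publisher_code
                  """).fetchone()
conn.close()
```

The join in the final query is the "Yes, if they share a common column" test in action: `publisher_code` appears in both tables and links each Book to its one Publisher.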
END OF NOTES.
Break down the database development process into its key steps. Describe the tasks associated with each of the key steps.
Database development is a systematic process that moves from concept to design to implementation. It also takes into account the needs of potential users and the operational and/or business processes in the organization.
1. Define business processes: Many database development efforts begin by defining the key business and/or operational processes within the organization. Developers first create high-level models showing the major activity steps associated with marketing, sales, production, human resource management, public relations, research and development, and so on. Taken together, these process maps represent an enterprise-wide model of the organization and its core processes.
Database Development Process: Step Two
2. Determine scope of database development effort: The next step in the database development effort is to select one process or a set of related processes for further analysis and improvement.
Knowledge Check
In general, what is the first step in the basic process used to develop a database?
A. Define the information needs
B. Develop conceptual design
C. Define business processes
D. Determine scope of database development effort
3. Define the information needs: Once a business process (or set of processes) has been selected, the next step is to define the information needs of users involved in or affected by the business process.
Database Development Process: Step Four
4. Develop conceptual design: A basic understanding of these needs is used to create a conceptual design for the database. At this stage, a conceptual data model is created that illustrates relationships between information sources, users, and business process steps.
5. Develop logical data model: The conceptual data model is used to develop a logical data model based on one of the primary DBMS types: relational, hierarchical, network, or object-oriented approaches.
Database Development Process: Step Six
6. Develop physical design: With the logical data model in hand, developers move to the physical design, which involves determining the specific storage and access methods and structures.
7. Create and test database: Once this step is complete, developers can go ahead and create the database using whatever DBMS has been selected. Small amounts of data can be entered into the database for testing purposes. This is also the time to start developing sample screens and reports to determine if the database design will meet the predefined requirements. It is much easier to revise and change the database during this testing phase, before all of the data have been entered. The term prototyping refers to the iterative process used to try different report formats and input screens to determine their suitability and effectiveness.
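The create-and-test step above can be sketched end to end: build the database, enter a small amount of test data, and prototype a sample report to see whether the design meets the predefined requirements. This uses Python's sqlite3 as the stand-in DBMS; the `sale` table, its columns, and the figures are invented purely for illustration.

```python
import sqlite3

# Create the database and its structures
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE sale (
                   sale_id INTEGER PRIMARY KEY,
                   region  TEXT,
                   amount  REAL)""")

# Enter a small amount of data for testing purposes
cur.executemany("INSERT INTO sale (region, amount) VALUES (?, ?)",
                [("East", 100.0), ("East", 50.0), ("West", 75.0)])

# Prototype a sample report: revising the query (or the table design) now,
# before all the data have been entered, is far cheaper than doing it later.
report = cur.execute("""SELECT region, SUM(amount)
                        FROM sale
                        GROUP BY region
                        ORDER BY region""").fetchall()
conn.close()
```

Trying several such report formats and input screens against the test data, and iterating on the design until they are suitable, is the prototyping process the paragraph describes.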
Lesson Wrap-Up
The full potential of a database will only be realized if it is properly planned and developed. Database development is a systematic process that takes into account the needs of the organization and its potential users. This lesson provides a high-level roadmap of the basic steps that are followed to create effective databases. By now you should understand the purpose and logic of the database development process. Now that you have completed this lesson, you should be able to:
Break down the database development process into its key steps. Describe the tasks associated with each of the key steps.