Since I spent my college years in the early 1990s, I never had any academic experience such as taking a computer literacy class. Therefore, this week’s readings, which are full of jargon and abstract concepts from computer science, were hard to understand. Still, here is my understanding of the readings.
According to the readings, a database is a collection of data built for specific information needs, and databases can be classified in two ways: by the type of their content and by their specific subject area. The primary concerns of database design are efficiency and interoperability across different applications.

Among methods of designing a database, Chen’s entity-relationship model is popular. The entity-relationship model is like drawing a blueprint suited to a customer’s needs, and it shows the relationships between every entity within the database. Chen argued that this method is natural rather than artificial, because this way of modeling has existed since ancient Greece. However, the model is not used as widely as it is cited in academic work, due to serious limitations: it needs to be revised frequently, and it cannot be applied to pre-existing information sources.

As for the database normalization process, it is a useful practice for understanding how a database is actually built. It is quite similar to making a calculation table in MS Excel, although it is much more complicated. Briefly, database normalization analyzes each entity and breaks entities into small tables to prevent redundancy of data.
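To make the normalization idea concrete for myself, here is a minimal sketch in Python. The book/author table and its field names are my own invented example, not from the readings; it only illustrates splitting one table with repeated data into two smaller tables.

```python
# Hypothetical denormalized table: the author's name and country are
# repeated on every book row, which is the redundancy the readings mention.
books = [
    {"title": "Dubliners", "author": "James Joyce", "author_country": "Ireland"},
    {"title": "Ulysses", "author": "James Joyce", "author_country": "Ireland"},
    {"title": "Beloved", "author": "Toni Morrison", "author_country": "USA"},
]

# Normalization step: move the repeated author facts into their own table,
# keyed by the author's name, and keep only a reference in the book table.
authors = {}
book_rows = []
for row in books:
    key = row["author"]
    authors[key] = {"name": row["author"], "country": row["author_country"]}
    book_rows.append({"title": row["title"], "author_id": key})

# Each author fact is now stored exactly once.
print(len(authors))    # 2 distinct authors instead of 3 repeated entries
print(len(book_rows))  # still 3 books, each pointing at its author
```

The point of the split is that a fact like an author’s country now lives in exactly one place, so updating it cannot leave the tables inconsistent.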