1. Suppose that the small firm for which you designed the Netflix®-like DVD rental database has established an Internet presence and their sales are growing rapidly. You have been asked to examine the database to make sure that it can grow to support at least a million transactions per day. Your assignment is to examine the Oracle or MSSQL version of the provided Netflix schema, which you have now reverse-engineered into Visio or a database design tool of your choice, and identify performance and other scaling problems. The frequently occurring operational transactions are:

· A customer updating their wish list on the web.
· The receipt of a DVD, with the corresponding mailing of the next in-stock DVD from the customer’s wish list.
· Monthly billing of each customer’s credit card.

Within no more than two paragraphs:

· What changes would you make to the provided schema so that it can scale to handle a million or more transactions per day?
· What indexes would you add?
· Would you denormalize? If you would denormalize, how would you maintain the denormalizations?

To address the scalability concerns and ensure that the database can handle a million or more transactions per day, several changes can be made to the provided schema.

Firstly, identify and eliminate the bottlenecks in the three high-volume paths: wish-list updates from the web, DVD receipt and shipment, and monthly credit-card billing. Analyze the queries behind each transaction, find inefficient or long-running operations, and apply query optimization, indexing, and caching where they measurably help. Tuning database configuration parameters, such as buffer sizes and connection pooling, further improves scalability under concurrent load.
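As a minimal sketch of diagnosing such a bottleneck, the snippet below uses SQLite (via Python's stdlib sqlite3 module) as a stand-in for the Oracle/MSSQL target; the wish_list table and its columns are assumptions for illustration, not the actual Netflix schema. Before any index exists, the per-customer wish-list lookup is a full table scan, which is exactly what query-plan analysis would flag:

```python
import sqlite3

# Illustrative only: SQLite stands in for Oracle/MSSQL, and the table and
# column names (wish_list, customer_id, ...) are assumed, not from the
# provided schema.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE wish_list (customer_id INTEGER, dvd_id INTEGER, priority INTEGER)"
)

# Inspect the plan for the hot wish-list query; with no index the detail
# column reports a full scan of wish_list.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM wish_list WHERE customer_id = ?", (42,)
).fetchall()
print(plan[0][3])
```

Oracle (EXPLAIN PLAN) and SQL Server (the estimated execution plan) expose the same information; the point is to let the optimizer show you which of the three hot transactions is scanning instead of seeking.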

In terms of schema design, denormalization can be considered to improve performance. Denormalization duplicates data or introduces redundant columns to eliminate joins and shorten query execution time, which is particularly useful for frequently accessed data: for example, caching each customer's next in-stock wish-list title so the receipt-and-ship transaction does not have to join the wish list against inventory on every return. However, denormalization introduces the risk of data inconsistency if the redundant copies are not properly maintained, so the trade-offs must be carefully evaluated and mechanisms established to ensure data integrity.

In order to support the identified operational transactions, proper indexing is essential. Indexes significantly improve query performance by allowing faster data retrieval, and the index design should follow the actual queries and their frequency: index the attributes used in search conditions, join columns, and sort keys, for example the customer key on the wish-list and rental tables and the billing date on the payment table. At the same time, every index adds storage and write-time maintenance overhead, so the set of indexes must stay lean enough that the insert and update rate of a million daily transactions is not throttled.
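A hedged sketch of such indexes, again using SQLite through Python's sqlite3 as a stand-in for Oracle/MSSQL; all table, column, and index names here are assumptions chosen to mirror the three hot transactions:

```python
import sqlite3

# Assumed schema fragments for illustration -- not the provided Netflix schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE wish_list (customer_id INTEGER, dvd_id INTEGER, priority INTEGER);
CREATE TABLE rental (customer_id INTEGER, dvd_id INTEGER,
                     shipped_date TEXT, returned_date TEXT);
-- Index the columns the hot transactions filter, join, and sort on:
CREATE INDEX ix_wish_list_customer ON wish_list (customer_id, priority);
CREATE INDEX ix_rental_customer    ON rental (customer_id, returned_date);
""")

# The wish-list lookup now seeks through the composite index instead of
# scanning the table.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT dvd_id FROM wish_list "
    "WHERE customer_id = ? ORDER BY priority", (7,)
).fetchone()
print(plan[3])
```

The composite (customer_id, priority) index also satisfies the ORDER BY, so the "next in-stock DVD from the customer's wish list" query needs no separate sort step.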

Regarding the maintenance of denormalizations, several strategies can be employed. One approach is to develop triggers that update the denormalized data whenever the relevant source rows are modified, so that every change is reflected automatically and consistency is preserved. In Oracle, materialized views (or indexed views in SQL Server) can maintain the redundancy declaratively instead of through hand-written trigger code. Regular data integrity checks and validation processes can then identify and correct any drift, and comprehensive documentation and well-defined procedures ensure that application code handles the denormalized data correctly.
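The trigger approach can be sketched as follows, with SQLite syntax standing in for Oracle/MSSQL; the denormalized wish_list_count column and the table names are hypothetical, not part of the provided schema:

```python
import sqlite3

# Trigger-maintained denormalization sketch. The redundant
# customer.wish_list_count column is an assumed example.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (customer_id INTEGER PRIMARY KEY,
                       wish_list_count INTEGER NOT NULL DEFAULT 0);
CREATE TABLE wish_list (customer_id INTEGER, dvd_id INTEGER);

-- Keep the redundant count in sync on every insert and delete.
CREATE TRIGGER wish_list_ins AFTER INSERT ON wish_list BEGIN
  UPDATE customer SET wish_list_count = wish_list_count + 1
  WHERE customer_id = NEW.customer_id;
END;
CREATE TRIGGER wish_list_del AFTER DELETE ON wish_list BEGIN
  UPDATE customer SET wish_list_count = wish_list_count - 1
  WHERE customer_id = OLD.customer_id;
END;
""")

conn.execute("INSERT INTO customer (customer_id) VALUES (1)")
conn.execute("INSERT INTO wish_list VALUES (1, 101)")
conn.execute("INSERT INTO wish_list VALUES (1, 102)")
conn.execute("DELETE FROM wish_list WHERE dvd_id = 101")

count = conn.execute(
    "SELECT wish_list_count FROM customer WHERE customer_id = 1"
).fetchone()[0]
print(count)  # 1
```

Because the triggers fire inside the same transaction as the source-table change, the denormalized count can never be observed out of sync by other sessions, which is the integrity guarantee the paragraph above calls for.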
