The innovation in technology and the use of information in every form of business have made information technology a critical element of almost any business organization, either for internal use or for facing the external world (customers and the general public). The growth of electronic commerce and the use of sophisticated systems such as Virtual Private Networks and 128-bit encryption have created a requirement not only for the security and safety of the information being processed but also for quality of service through efficiency and quick response.
Hence it is no longer sufficient merely to install a sophisticated information technology system; the installed system (whether software or hardware) must be configured to provide optimum results continuously in order to achieve quality of service. This dissertation presents an investigation in this area and proposes a few generic principles for configuring and achieving quality of service in IT systems.
The research aims to throw light on the critical nature of Quality of Service in information technology systems and the basic methods to improve it. Furthermore, the investigation in this dissertation will provide valuable input to decision makers in organizations to identify the essential elements required in an IT implementation for a given business purpose.
The aim of this report is to investigate and configure quality of service in an IT-based environment. This is accomplished by conducting the research in the light of the following objectives:
1. To conduct a generic overview of Quality of Service and identify its critical nature in an information technology service environment using academic resources.
2. To investigate quality of service in an IT environment through a case-study analysis using secondary resources such as journals and company profiles. The investigation throws light on the various aspects of quality of service with respect to IT environments by citing relevant examples and critically analysing them with secondary resources.
3. To conduct primary research by implementing a Windows 2000 Server installation in a local area network and configuring the MS SQL Server 2000 RDBMS in the network.
Primary research for a topic of this nature will be productive and quantifiable only through a practical implementation, as opposed to conventional methods such as questionnaires or opinion polls, which give accurate results for a business study but not for a technical implementation. Hence the research is conducted through a real-time implementation of Windows 2000 Server in a local area network of ten desktop computers. A detailed overview of the methodology and the results is presented in Chapter 4 on configuration.
This is the current chapter, which introduces the reader to the aim and objectives of the research and provides an overview of what the research is about.
The literature review commences with a generic overview of the concept of Quality of Service with critical analysis from academic resources. This is followed by a detailed overview of quality of service in an IT-based environment and the critical success factors defined by academic authors for achieving higher quality of service in information technology systems.
This chapter presents a case study analysis, based on secondary resources, of quality of service in information technology systems. The case study provides an in-depth qualitative analysis of the topic with examples from various business cases where quality of service is achieved in an IT implementation, both internal and external to an organization. The chapter concludes with a critical analysis of communication using computers in VOIP (Voice over Internet Protocol) networks, drawing on secondary resources.
In this chapter a primary research implementation is conducted by installing Microsoft Windows 2000 Server in a local area network and configuring the MS SQL Server 2000 RDBMS in the network. The participating organization, whose identity is withheld, already ran Windows NT across its local area network and was intending to install a new operating system across the network. The details and the results of the implementation are discussed in this chapter.
In this chapter a critical analysis of the objectives against the research accomplished in the report is presented to the reader. The analysis will reveal the coherence of the research with the aim and objectives stated initially. This is then followed by a summary of the entire research and the conclusion to the dissertation. A few constructive recommendations for further research on the topic are provided at the end of this chapter.
Quality of Service is defined as the extent to which an organization can accomplish the task of serving its customers effectively and increase its productivity, as argued by John Ward and Joe Peppard (2002). It is interesting to note that increasing globalisation and the extensive use of information technology in business processes in every industry have intensified competition tremendously; hence, in order to distinguish themselves from competing organizations, competitors increasingly depend upon the service concept of business and have realised that retaining existing customers will generate continuous business more reliably than identifying new customers in the target market.
Furthermore, Quality of Service is considered a way of increasing the performance of the organization through the use of innovative methods and technologies. For example, the use of information technology (computers and electronic processing) is one of the methods adopted by many organizations to increase their quality of service through accurate processing of information and timely services. Gerry Johnson and Kevan Scholes (2001) argue that Quality of Service is the backbone not only of service-based organizations but of any company, irrespective of the nature of its business, because business is always targeted at a customer, whether the general public or another business organization. In both cases quality of service is imperative, as argued by the authors.
Alongside, the Institute of Electrical and Electronics Engineers (IEEE) defines quality of service in terms of technology as the process of harnessing the most out of a technology implementation in an organization in order to increase the productivity and overall performance of the client organization deploying the technology. The IEEE also holds that Quality of Service is imperative in any technology implementation within a company, whether IT-based or not, mainly because technology and innovations in technology exist primarily to support business. John Ward (2002) further argues that business should drive technology in order to achieve sustainable competitive advantage in the target market and gain market share.
Since this report is focused on the technical aspects of quality of service, further analysis on Quality of Service in the light of business is out of the scope of this report.
In a technical environment, David Kossmann (2003) argues that Quality of Service is accomplished mainly by achieving the accurate results agreed by the parties involved, within the agreed time schedule. This creates room for two major aspects of Quality of Service in terms of technology:
a. Accuracy and
b. Punctuality
David Kossmann (2003) argues that accuracy is achieved not only by producing the expected results time and again; accuracy in the overall process of developing the technology and implementing the resulting project is equally essential to achieving Quality of Service. John Ward (2002) further argues that in a technical implementation accuracy plays a critical role not only in achieving project success or quality of service but mainly in building the end users' trust in the system, since a technology is obviously developed for a group or groups of end users.
Apart from the above statements by eminent authors, another important aspect of accuracy with respect to technical project implementation is that the agreed output must provide accuracy not only in terms of value (the quantitative result) but also in terms of the content of the output (the qualitative result), as argued by R. C. Hibbler (2001). The fact that the quality of the output is normally measured by analysing the entire process of the project rather than the final output alone makes it critical to maintain accuracy at all levels of the project. A further overview of this area with respect to information technology and computers is presented in section 2.2.
Time is a critical factor in any project implementation, since R. C. Hibbler (2001) argues that a project's success lies not only in the quality or accuracy of its output but essentially in its ability to meet the agreed time deadlines. Increasing competition and continuous innovation in technology have increased the need for rapid development and implementation of projects, both computer-based and non-IT, because a company can leverage competitive advantage and increase its revenue through an implementation only when it is a first mover in the business.
The success of www.yahoo.com, the leading e-mail service provider and Internet portal across the globe, justifies this statement. Alongside, the growth of the motor company General Motors has come through continuous innovation in manufacturing combined with timely market entry, thus eliminating competition. Even though the report is focused on Quality of Service in its technology aspects, the fact that business embraces technology, as stated before, is the reason for the aforementioned statements on General Motors and Yahoo.
It is further intriguing to note that A. Kemper (2002) has argued that punctuality in delivering products to the market is only one part of the picture; essentially, punctuality in a technical project reflects the ability of the implementation to provide accurate results time and again within the time frame it was initially designed for. The example cited by A. Kemper (2002) of the Ferrari range of cars providing high performance to their owners every time justifies the above statement. Hence Quality of Service lies not only in delivering the end product within the agreed time frame but mainly in providing output on the agreed time targets every time the system is put into operation.
Increasing competition in business and continuous innovation in technology have increased the use of combined technologies in every segment of business and technology, including information technology. From a generic perspective, this combined use of technologies increases the need for a new project or product delivered by a third party to work efficiently not only on its own but mainly when utilised as part of the company's technology as a whole. R. C. Hibbler (2001) has stated that the technological innovations and new projects conducted by leading research companies specialising in various areas of technology will be successful only when the end product is capable of providing the desired performance when deployed in combination with other technologies, or in other words when the project is implemented as part of a bigger project itself.
The implementation of 128-bit encryption in electronic commerce transactions over the Internet was successful only because the technique performed efficiently in line with the software used by the Internet service providers and by the company conducting the transaction itself. This justifies that Quality of Service is assessed not only on the individual performance of an end product but mainly on whether it functions efficiently when deployed in combination with other technologies.
From the above arguments it is clear that Quality of Service is a critical element for achieving success in business as well as for gaining competitive advantage in the market, both for the business and for the vendors of the technology. In the next section a detailed overview of Quality of Service within information technology and computer science is presented to the reader.
Increasing innovation and competition in business have forced competitors at both national and international levels to utilise information technology products to increase the quality of service in the business and achieve competitive advantage. In this section a detailed overview of Quality of Service in IT-based products and projects is presented in the light of technology.
Along with the elements of accuracy and punctuality mentioned in section 2.1.2, a critical factor particularly significant for IT-based projects and IT-related products is flexibility, as stated by Efraim Turban et al (2004). John Ward and Joe Peppard (2002) argue that flexibility in an IT-based product is mainly a matter of incorporating new changes within the system without hindering the primary system or the business process itself. This further explains that the flexibility of an IT project is essential not only for operating as part of a bigger software system but also for providing enough room for further development and integration of the project as the business changes.
Flexibility in an IT-based implementation is mainly concerned with interoperability and the ease with which the system can be configured to work with a foreign system. The evolution of J2EE (Java 2 Enterprise Edition) from Sun Microsystems and the .NET framework from Microsoft are classic examples of incorporating flexibility and interoperability within a system.
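The idea of incorporating new capabilities without disturbing the primary system can be sketched in code. The following is a minimal, hypothetical Java example (the class and interface names are illustrative, not taken from any real product): the core logic is written against an interface, so a new back-end can be registered later without any change to the core.

```java
// Hypothetical sketch: flexibility through programming to an interface,
// so new back-ends can be added without changing the core system.
import java.util.ArrayList;
import java.util.List;

interface StorageBackend {            // assumed abstraction, not a real API
    String store(String record);
}

class FileBackend implements StorageBackend {
    public String store(String record) { return "file:" + record; }
}

class CloudBackend implements StorageBackend {  // added later, core untouched
    public String store(String record) { return "cloud:" + record; }
}

public class FlexibleCore {
    private final List<StorageBackend> backends = new ArrayList<>();

    public void register(StorageBackend b) { backends.add(b); }

    // The core logic never changes when a new backend is registered.
    public List<String> storeEverywhere(String record) {
        List<String> results = new ArrayList<>();
        for (StorageBackend b : backends) results.add(b.store(record));
        return results;
    }

    public static void main(String[] args) {
        FlexibleCore core = new FlexibleCore();
        core.register(new FileBackend());
        core.register(new CloudBackend());
        System.out.println(core.storeEverywhere("order-42"));
    }
}
```

This is the same design principle that underlies the interoperability features of platforms such as J2EE and .NET, reduced to its simplest form.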
It is further argued that interoperability is not the only criterion for flexibility in IT projects, mainly because of the volatile nature of information technology products themselves, driven by continuous innovation and the increasingly common policy of issuing updates and patches. Joanne O. Cooper (2003) comments that information technology has grown from its initial conception as a mere supporting tool for business organizations into a form of service, mainly because of the increasing dependency of competing organizations on information and, above all, because defects in a project that would otherwise force a complete replacement increase the operating costs and investment of the organization, eventually reducing the return on investment (ROI). The above statement clearly justifies that flexibility in an information technology project is not just a matter of interoperability but mainly of incorporating any relevant updates without the hassle of a long system outage and recovery.
The initial years of the information technology boom during the 1990s perceived Just in Time delivery mainly as the idea of delivering the developed software on time, conforming to the agreed standards. Joanne O. Cooper (2003), in her research on Quality of Service in information technology systems, further argues that the twenty-first century has not only increased the rate at which technology is replaced but has also broadened what is demanded of an IT project: beyond timely delivery and accuracy of the software, the entire project must operate integrally for the given business case and enable the client organization that commissioned or invested in the project to achieve Just in Time delivery of its own core product through the deployment of the software or hardware so produced.
This clearly justifies that Quality of Service with respect to an information technology project, whether software or hardware, can be achieved in today's business environment only through the efficient development of the project embracing the core requirements of the business itself rather than of the individual organization.
Furthermore, the concept of Just in Time delivery extends to the issues of system outage and backup recovery. System outage mainly concerns the time during which the services provided by the installed IT system are suspended; the parties involved in using the system, both high-level users and basic users, need to be notified, and a time frame should be agreed for the return of the system after service or maintenance, or after upgrading with necessary patches in the case of a software system outage. From the above statement it is intriguing to note that the demands on IT systems in today's business environment are increasing not only because of the costs involved in purchasing or developing a system but mainly because of the costs of maintaining the system once developed and installed.
The increasing expenditure by organizations on developing and maintaining information technology and related products is the primary reason clients and end users demand higher Quality of Service when any genuine updating or servicing of an installed IT product is carried out, whether hardware (where the outage primarily concerns changes to the physical systems themselves) or software (where the outage concerns the software product installed in the company).
Stability is a critical issue in any technical implementation, mainly because only a stable system can be assessed against a set of criteria for determining the Quality of Service it delivers, irrespective of the industry or the technology involved. In the case of information technology this is even more critical, as argued by Joanne O. Cooper (2003), since the stability of the IT system is essential not only for assessing Quality of Service but mainly for maintaining the accuracy of information and for verifying the system's ability to respond to serious situations such as a virus attack or hacking by external entities.
Stewart Robinson (1995) has criticised IT projects implemented in real-time business situations, from a basic desktop application to a network-wide distributed software installation, as invariably lacking stability unless they are debugged and tested against test cases covering the different situations faced by the business on a day-to-day basis, and unless every exception that arises, such as division by zero, is handled. With the increasing complexity of a platform like J2EE, where the developer may write hundreds or even thousands of lines of code to implement a project, the software must handle such exceptions in order to prevent bugs within the system.
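The division-by-zero case mentioned above can be illustrated with a short, self-contained Java sketch (the method name and fallback convention are this author's illustration, not from any cited source): the exceptional input is handled explicitly so that one bad value does not destabilise the whole system.

```java
// Minimal sketch: guarding a calculation against the division-by-zero
// case, so one bad input does not crash the system.
public class SafeDivision {

    // Returns the quotient, or a supplied fallback when the divisor is zero.
    static double divideOrDefault(double numerator, double divisor, double fallback) {
        if (divisor == 0.0) {
            return fallback;          // handle the exceptional case explicitly
        }
        return numerator / divisor;
    }

    public static void main(String[] args) {
        System.out.println(divideOrDefault(10.0, 2.0, 0.0)); // 5.0
        System.out.println(divideOrDefault(10.0, 0.0, 0.0)); // falls back to 0.0
    }
}
```

Testing against cases like these, alongside ordinary inputs, is exactly the kind of day-to-day exercising of the system that Robinson argues is needed before an IT project can be considered stable.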
This justifies that Quality of Service in an IT environment lies not just in the stand-alone performance of the end product but mainly in its working in tandem with the business requirements of the client to deliver a high level of performance and efficiency in the business process itself.
From the above overview it is clear that Quality of Service in the implementation or development of a technology primarily concerns accuracy and the time deadlines involved, whilst in the case of an IT product, where the end product is actually a supporting element for another business, Quality of Service grows more complex with the increasing complexity of the system itself. In the next chapter a secondary research analysis, in the form of a critical investigation of the Quality of Service of IT systems with examples from various software vendors and real-time software implementations, is presented to the reader.
From the early years of using a personal computer for word processing up to the electronic commerce era of the twenty-first century, the companies competing in IT product development, as well as the organizations utilizing IT products (such as customised software implemented locally on a given company network), have increasingly invested in IT to achieve higher performance and speedy processing of information. The Keynote market report on computers and computer-related products for the year 2004 revealed that companies in the IT business face stiff competition from India, which has become the hub of information technology outsourcing across the globe.
Since the IT business itself has various segments, such as desktop programming, distributed computing, networking, communications, the Internet and telephony, and electronic commerce, the investigation is carried out primarily on relational database products (RDBMS), including Microsoft SQL Server and IBM DB2 Universal Database, and on telephony through VOIP, an emerging technology in IT and networking. As mentioned before, the investigation is primarily based upon journals, research reports and company profiles.
It is clear that the whole of information technology, irrespective of its complexity or level of application, is dependent on data, as argued by Efraim Turban et al (2004).
This has created the need to secure information and to maintain the consistency of data across the whole company.
A database, as identified by David Flanagan (2003), is typically a file that holds or stores data and presents it to users in the form of information (processed data). Relational databases in general are those databases which maintain the relationships between the various elements of information held in different tables, and which maintain the consistency of that information through efficient manipulation of the relationships established within the database.
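The kind of consistency a relational database enforces between related tables can be sketched in miniature. The following hedged Java example models two "tables" with maps (the table names and the map-based model are this author's simplification of what a real RDBMS does with foreign-key constraints): an order may only be inserted if it references an existing customer.

```java
// Hedged sketch: the referential-integrity check a relational database
// performs between two related tables, modelled here with maps.
import java.util.HashMap;
import java.util.Map;

public class ReferentialIntegrity {
    static Map<Integer, String> customers = new HashMap<>();   // "customers" table
    static Map<Integer, Integer> orders = new HashMap<>();     // orderId -> customerId

    // Mimics a foreign-key constraint: an order may only reference
    // a customer that actually exists.
    static boolean insertOrder(int orderId, int customerId) {
        if (!customers.containsKey(customerId)) return false;  // constraint violated
        orders.put(orderId, customerId);
        return true;
    }

    public static void main(String[] args) {
        customers.put(1, "Acme Ltd");
        System.out.println(insertOrder(100, 1)); // true: customer exists
        System.out.println(insertOrder(101, 9)); // false: no such customer
    }
}
```

A real relational engine enforces this declaratively and far more efficiently, but the principle, that relationships between tables are actively policed to keep the data consistent, is the same.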
The demand on relational databases goes beyond day-to-day data consistency. The two critical areas that relational databases are expected to address are:
1. Storage Management and
2. Backup recovery
Storage Management: Storage management is mainly concerned with the efficient use of storage space by the databases. The use of Content Manager and Tivoli Storage Manager by IBM DB2 Universal Database to configure and store information in a three-tier architecture, in order to protect information consistency as well as to prevent unauthorised access, is a classic example of delivering Quality of Service through relational database systems in IT implementations within a company or in client organizations. Julian Curtti et al (2004) state that IBM Content Manager deploys the three-tier architecture through three essential elements in the storage management system for capturing and storing the information processed by DB2 Universal Database. They are:
a. Browser: This serves as the front end to the entire system, which is seen and manipulated by the users. The users send their requests to retrieve information through the Content Manager browser services. Alongside, the browser services not only provide a user-friendly system with interactive screens; they actually perform the initial process of validating and authenticating users before they gain access to the Library Server itself.
b. Library Server: The Library Server is the storage system that holds information on where the data is exactly stored in the storage media. This is because of the intriguing fact that the amount of information handled by IBM DB2 Universal Database version 8.0 and higher is virtually unlimited and can grow without bound through the combined use of IBM Content Manager with Tivoli Storage Manager. The Library Server first authorises the users, who were initially authenticated by the browser services, to access the requested information based upon the role their user profile is configured for. The configuration testing in the next chapter will throw light on this area. The Library Server accomplishes this authorisation through the Role-Based Access Control (RBAC) technique, which has been deployed by the United States Department of Defense (DoD) since the 1980s.
c. Resource Manager: This is the final tier, where the actual information is stored, mapped exactly in line with the addresses held in the Library Server. The Resource Manager has no authorisation of its own, but the two-stage authentication and authorisation up to the Library Server accomplishes the task of preventing unauthorised access to the data. The fascinating fact about the Content Manager is that a single Content Manager installation with IBM DB2 Universal Database can map as many Resource Managers as required, in order to facilitate the data warehousing services demanded by users and business analysts for forecasting and statistical analysis.
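The three-tier flow described above can be sketched as a short Java example. The names and data structures here are purely illustrative (they are not IBM's API): tier one authenticates the user, tier two authorises the request against a role table in the spirit of RBAC, and only then does tier three return the stored data.

```java
// Illustrative sketch (hypothetical names, not IBM's API) of the
// three-tier flow: browser tier authenticates, library-server tier
// authorises via role-based access control, resource-manager tier
// returns the stored data.
import java.util.Map;
import java.util.Set;

public class ThreeTierFlow {
    // Tier 1: browser-side authentication (user -> password)
    static final Map<String, String> PASSWORDS = Map.of("alice", "secret");
    // Tier 2: library server's RBAC table (user -> permitted resources)
    static final Map<String, Set<String>> ROLES = Map.of("alice", Set.of("reports"));
    // Tier 3: resource manager's actual storage (resource -> data)
    static final Map<String, String> STORE = Map.of("reports", "Q4 figures");

    static String fetch(String user, String password, String resource) {
        if (!password.equals(PASSWORDS.get(user)))
            return "DENIED: authentication failed";       // stopped at tier 1
        if (!ROLES.getOrDefault(user, Set.of()).contains(resource))
            return "DENIED: not authorised";              // stopped at tier 2
        return STORE.get(resource);                       // served by tier 3
    }

    public static void main(String[] args) {
        System.out.println(fetch("alice", "secret", "reports"));
        System.out.println(fetch("alice", "wrong", "reports"));
        System.out.println(fetch("alice", "secret", "payroll"));
    }
}
```

The point of the layering is visible even at this scale: a request that fails at an earlier tier never reaches the tier that holds the data, which is how the two-stage authentication and authorisation protects the Resource Manager.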
The above arguments on the Content Manager for the IBM DB2 relational database system show that Quality of Service in the field of data warehousing is accomplished by using a three-tier architecture to manage any amount of information held by a client organization. The fact that a three-tier architecture, essentially a programming methodology adopted for creating client-server applications with the database as one tier, is used within the relational database system itself proves that the competitors are keen to achieve high Quality of Service through the combined use of methods and techniques.
Microsoft SQL Server 2000, on the other hand, is the second relational database system investigated in this section. The success of the .NET framework from Microsoft since its release in 2001 has revolutionised the entire method of processing information and transferring data between two points.
Jeffery Shapiro (2000) says that Microsoft SQL Server 2000, which was aimed at serving the relational database requirements of the Windows platform, actually serves as a comprehensive data warehousing solution as well as a robust RDBMS that can operate not only within the network but can also be deployed easily over the Internet without interruptions or hacking issues. Furthermore, R. Berumandin et al (2003) have justified that SQL Server 2000, when implemented in a .NET framework, provides versatile capabilities for the developer to integrate information stored in a database at a remote location. The integration facilities, through interoperability and remote database access within the .NET architecture itself, have increased its popularity in Internet business, and it is impeccably versatile, as argued by R. Berumandin et al (2003). A classic example of the successful implementation of the .NET architecture with the SQL Server 2000 relational database system is the electronic commerce venture of Marks and Spencer Plc, the UK retail giant (Microsoft, 2003). The implementation of the .NET framework in the company as early as 2002, when the product was in the initial stages of its market promotion, increased the electronic business sales of Marks and Spencer by 15% (Annual Report, 2004) in the year ending April 2004. Furthermore, the use of the MS SQL Server 2000 relational database system for storage and manipulation has reduced the use of middleware in the electronic commerce establishment of Marks and Spencer.
Alongside the increase in sales on the electronic front in 2003, the use of the information stored in the databases, accessed remotely by store managers and other corporate-level decision makers to analyse the information and effectively forecast demand, has improved the efficiency of the company's supply chain considerably compared with 2002, when the company suffered heavy losses owing to communication gaps in the supply chain that led to shortages of the essential summer clothing range released by Marks and Spencer Plc for 2002 (Annual Report, 2001). The increase in sales for the financial year 2003, seen in the annual report for the year ending April 2004, justifies the above statement. The Chairman's message in the company's annual report for the year ending April 2005, stating that the company's e-business performance has been appreciably strong since the establishment of the Microsoft .NET architecture in 2003, further justifies the Quality of Service achieved by Microsoft in its products.
The above arguments have provided comprehensive research information on ways of achieving Quality of Service in relational database systems. But it is also essential to analyse application development methodologies and how organizations have achieved Quality of Service in this area of IT systems development.
The pace of change in the information era, especially in the technologies deployed for developing an application, is very high. This has forced the adoption of a Rapid Systems Development strategy in IT projects, both to deliver optimum Quality of Service for the investment in a specific product and to meet competition in the market. Apart from the vendors' need to address the increasing rate of change in the IT systems environment, the clients' need to meet competition as quickly as possible, by being an early entrant in the target market and gaining an accelerating rate of business development, is essential in the highly competitive global environment of the twenty-first century.
Hargrave D (1996) further argues that a structured approach to application development is a successful strategy for smaller applications, i.e. those that can be developed quickly even using conventional methods; but for applications that are mission-critical and enterprise-wide, development must be rapid as well as objective in order to reach the market quickly. If the client organization's need to reach the market quickly is about achieving a competitive edge by harnessing the initial demand, the IT vendors' interest in rapid systems development is mainly about increasing Quality of Service. Hargrave D (1996) argues that by providing a rapid solution for a given business situation, the IT vendor not only gains the satisfaction of the existing client but actually earns a positive performance rating in the external audits constantly monitored by organizations in pursuit of a competitive IT service provider to cater to their business requirements.
Also, the rapid change in technology and the continuous release of patches by software vendors such as Microsoft, IBM and other leading conglomerates have increased the pressure on companies providing software solutions to keep themselves updated, and to provide support and service by constantly updating the installed application for the client organization in line with updates in the technology and its critical nature to the business. Hargrave D (1996) argues that the use of object-oriented methods to develop complex systems will not only increase the speed of application development but, mainly, reduce redundancy in creating and updating the application at the back end of the system.
This is mainly because any requested change needs to be implemented only in the parent class, and it immediately takes effect in the subclasses and other objects instantiated from either the parent class or the inherited classes. J2EE's approach of handling requirements within a container has increased the speed both of development and of introducing changes to the system even after development. This proves that Quality of Service is integral to application development and is not restricted to databases and data retrieval alone.
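The propagation of a change from a parent class to its subclasses can be shown in a few lines of Java. The class names below are invented for illustration: editing the single `header()` method in the parent changes the behaviour of every subclass instance without touching the subclass code.

```java
// Small sketch of the point above: a change made once in the parent
// class is immediately visible in every subclass instance.
class Report {
    // Changing this one method changes the behaviour of all subclasses.
    String header() { return "ACME Ltd (rebranded)"; }
    String render() { return header() + " | " + body(); }
    String body()   { return "generic report"; }
}

class SalesReport extends Report {
    @Override String body() { return "sales figures"; }
}

public class InheritanceDemo {
    public static void main(String[] args) {
        System.out.println(new SalesReport().render());
        // The new header took effect without touching SalesReport at all.
    }
}
```

This is the redundancy-reduction argument in miniature: one edit in one place, rather than a matching edit in every derived module.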
The above sections presented a thorough investigation of achieving Quality of Service with respect to IT software systems and the development of software packages using different commercial application development methods and tools. It is, however, a known fact that a software program can be executed only when the computer supports it and is capable of running it in its existing configuration without hindering the business processes of the client organization. The increase in network-based computing and the use of cost-effective methods of communication like VOIP (Voice over Internet Protocol) over ISDN or ATM networking services is a giant leap in the use of information technology to increase the overall performance of the system as well as of the company that employs it.
The deployment of Virtual Private Networks and sophisticated protocols like Frame Relay has markedly increased the demands placed on these systems, with client organizations striving to reduce costs in every sphere of their business. The concept of transferring voice over the Internet as digital signals, i.e. digitising the analogue audio data and transferring it as ordinary data to the other end, not only increases the return on investment from the installed system but also reduces the costs involved in using telephone services for communication (James J. Jiang et al, 2005).
The need for precise, accurate transfer of voice without interruption or delay is the primary aspect being researched and developed by competing vendor organizations like Cisco, AT&T, etc. Communication between two remote points using computers depends on the concept of packet switching, where the data or information, whether video, audio or an ordinary data file, is transferred as fixed-size packets rather than as the bulk of the file in one stretch. James J. Jiang et al (2004) further argue that the three major areas for achieving Quality of Service with respect to VOIP services, where the voice transferred is live communication between two persons, are as follows:
1. Quality of Service within a single network element (for example, queuing, scheduling, and traffic-shaping tools).
2. Quality of Service signalling techniques for coordinating QoS end-to-end between network elements.
3. Quality of Service policy, management, and accounting functions to control and administer end-to-end traffic across a network.
The above statements clearly justify that Quality of Service is a critical element for the successful implementation of a communication system involving VOIP or any other data transfer technique.
The investigation using secondary resources has shown that Quality of Service is a critical element for achieving leadership in the information technology market. It is also clear from the investigation that information technology is increasingly becoming integral to the effective functioning of an organization, raising the demands placed on vendors and eventually making IT projects more focused on service than on mere technical implementation. The configuration exercise in the next chapter will throw light on achieving Quality of Service internally in an organization, as well as improving its business, through the installation and configuration of the Microsoft Windows 2000 operating system and the MS SQL Server 2000 relational database system.
The research was conducted at the company Express Plc (company name withheld at the client's request) by installing the Microsoft Windows 2000 Server operating system and configuring the MS SQL Server 2000 relational database system for data access within the organization. The participating organization was approached by the author of this report further to an advertisement in the local newspaper seeking professionals for software implementation.
Express Plc is a small business specialising in the sale of DIY products to customers in the West Midlands area. The company does not have an online transaction system, only a web presence; orders are taken over the phone by the company's customer service associates. The company's existing infrastructure is as follows:
The company has a network of ten desktop computers, one of which acts as a non-dedicated server holding the relational database system MS SQL Server 7.0.
The company employs five to eight customer service associates in an eight-hour working day, depending on the demand and density of calls. The associates use a Visual Basic front-end screen where they enter the customer and order details. This information is viewed by the despatch team, who deliver the goods after the payment is processed. The configuration exercise does not include the payment processing system, since it runs on a separate dedicated computer and the company was not willing to make any alterations to payment processing.
The problem faced by the company was that entering data into the system had become increasingly slow. Since Microsoft had withdrawn patches and update services for the Windows NT operating system, the company could not keep the network updated to function smoothly. It was therefore decided to upgrade the operating system to Microsoft Windows 2000 Server. Another problem was the operation of a non-dedicated server, which increased the time taken to process data and transfer information, since the same machine acted both as a desktop personal computer and as the database server for the entire network.
Another issue faced by the company was that retrieval of information and connection to the SQL Server 7.0 database failed frequently, which increased the time spent by the customer service associates in successfully entering the details of the order they were dealing with. The cause of this problem was identified as the server's inability to provide multiple connections to the client computers, owing to restrictions inherent in the Windows NT operating system itself, which limits the number of connections to the server based on the performance of the system. It is known that in order to transfer information between the server and a client, a connection must be established to conduct the transaction, and the connection must remain active until the transfer of information (termed a transaction) is completed. The Open Database Connectivity (ODBC) feature of the Windows operating system facilitates this connection in a client-server environment. A detailed analysis of ODBC concepts and database design is outside the scope of this research exercise.
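One common mitigation for intermittent connection failures of this kind is a retry wrapper around the connection attempt. The sketch below is generic and hypothetical, not part of the configuration actually performed in this exercise: `open_connection` stands in for whatever ODBC connection call the front-end application would make.

```python
import time

def connect_with_retry(open_connection, attempts: int = 3, delay: float = 0.5):
    """Try to open a connection, retrying with a simple backoff on failure.

    open_connection: a zero-argument callable that returns a live
    connection object, or raises OSError when the server refuses.
    """
    last_error = None
    for attempt in range(attempts):
        try:
            return open_connection()
        except OSError as exc:                  # connection-level failure
            last_error = exc
            time.sleep(delay * (attempt + 1))   # wait a little longer each time
    raise ConnectionError("all retry attempts failed") from last_error
```

A wrapper like this does not remove the underlying connection limit, but it turns a hard failure seen by the associate into a short, invisible delay whenever a free connection slot opens up in time.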
Apart from the above problems, another critical issue faced by the end users, i.e. the customer service associates who actually enter the data into the system, was that the system did not always store the information in the database: notification that a connection had broken in the middle of a transaction became available only after an appreciable period of time, by which point the user would have completed the conversation with the customer and started serving the next one. This problem of lost and inconsistent data was also mainly due to the system's lack of connections, preventing it from maintaining the accuracy of the information being processed and eliminating redundancy.
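Half-stored orders of this kind are precisely what database transactions guard against: either the whole order is committed, or none of it is. The point can be illustrated with Python's built-in `sqlite3` module (the table layout and names are invented for the example; the production system used SQL Server, not SQLite):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer TEXT, item TEXT)")

def save_order(customer: str, item: str, fail: bool = False) -> None:
    """Write an order atomically: a failure mid-transaction rolls back."""
    try:
        with db:  # the connection context manager commits on success,
                  # rolls back automatically if an exception is raised
            db.execute("INSERT INTO orders VALUES (?, ?)", (customer, item))
            if fail:
                raise ConnectionError("connection dropped mid-transaction")
    except ConnectionError:
        pass  # the insert above was rolled back, leaving no partial row

save_order("Smith", "paint")
save_order("Jones", "ladder", fail=True)   # simulated dropped connection
rows = db.execute("SELECT customer FROM orders").fetchall()
print(rows)   # [('Smith',)] — the failed order left nothing behind
```

With transactional writes, a broken connection can still lose an order, but it can never leave the database holding a half-entered, inconsistent record.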
As stated above, it was agreed that the Windows NT operating system installed across the network would be replaced by the Windows 2000 network operating system. The fact that Windows 2000 is based on Windows NT technology eliminates the hassle of making major hardware upgrades. It is worth mentioning that the company had IBM desktop personal computers with Pentium processors exceeding the minimum memory and processor speed requirements for Windows 2000; hence a hardware upgrade was not part of the installation itself.
To support the front-end Visual Basic software, which the company intended to retain, the software vendor who supplied it was commissioned to apply the patches necessary for the Visual Basic application to work smoothly under the new operating system configuration.
Also, in order to increase processing speed, it was decided that the server would be a dedicated server, with no end-user operations performed on it. Only the administrators were allowed access to the server, to eliminate accidental deletion of files and unauthorised access.
The complete installation and reconfiguration of the system was planned to be accomplished within seven working days. To allow for any unforeseen events or problems, a total outage period of ten working days was agreed with the company's management. The entire installation and configuration plan is tabulated below, with the deadlines adhered to.
Action | Proposed Time | Actual Time
Install Windows 2000 Server and configure the client nodes into the Windows 2000 network | 3 days | 2 days
Install MS SQL Server 2000 on the server and configure the client-server connections for all nine nodes in the network | 3 days | 2 days
Reconfigure the Visual Basic application by installing the patches to work with the new set-up, and test it | 2 days | 1 day
Trial run of the system | 2 days | 2 days
The entire installation and configuration was accomplished in seven working days as planned, leaving three of the ten agreed outage days unused and returning them to the company as effective business days.
The installation of the system with a dedicated server for the database and network monitoring increased the speed with which information was transferred between the server and the client nodes. This increased the number of calls handled by a customer service associate from ten effective calls per hour to fifteen. Extrapolating this across the entire network, the total number of calls handled by the company during busy hours increased from one hundred to one hundred and thirty-five, an overall improvement of 35% (a 50% increase per associate).
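Taking the reported figures at face value, the gains work out as follows (integer arithmetic on the busy-hour totals given above):

```python
# Per-associate throughput: 10 -> 15 effective calls per hour
old_per_hour, new_per_hour = 10, 15
per_associate_gain = (new_per_hour - old_per_hour) * 100 // old_per_hour

# Network-wide busy-hour total: 100 -> 135 calls
old_busy, new_busy = 100, 135
network_gain = (new_busy - old_busy) * 100 // old_busy

print(per_associate_gain, network_gain)   # 50 35
```

The network-wide gain (35%) is lower than the per-associate gain (50%) because not every associate is on a call for the whole busy hour.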
Installing the patch for the Visual Basic application in the front end not only avoided the cost of developing a new software application but also reduced training costs and the costs of a longer system outage, which would have hindered the entire business process. In addition, retaining the same front-end application raised the performance of the end users, i.e. the customer service professionals, who were already familiar with the various screens and the functionality of the system.
The accuracy of the information also increased greatly because of the efficient management of connections between the database server and the client computers through the use of a dedicated server. The initial problems of data loss due to terminated connections with the database server were completely eliminated, since the Windows 2000 Server operating system can handle multiple connections while MS SQL Server 2000 can serve data to all of them through dynamic locking of individual records rather than the entire recordset.
Quality of Service was achieved through the above configuration in the following areas:
The efficient performance of Windows 2000 Server and the MS SQL Server 2000 relational database has demonstrably raised the Quality of Service levels of Microsoft's products. It has also increased Quality of Service within the company itself, through the efficient performance of the customer service staff as well as the payments processing and despatch sections, since the consistency of the data initially entered was achieved through the efficient deployment of MS SQL Server 2000 across the network.
In addition, the number of calls handled by the customer service professionals during busy hours increased tremendously, eventually increasing the company's sales and thus the revenue of the entire organization. The increase in revenue in turn provides a higher return on investment within the organization.
The increase in the efficiency of the system also increases its reliability, thereby gaining customer loyalty and competitive advantage in the market. By accomplishing this, the company can widen the geography of its target market, increase its market share and develop into new areas of business.
The major limitation of the above exercise is the withholding of the company name. The participating organization was not willing to reveal its identity in the report, even though it was made clear that the research is for academic purposes only.
Also, end-user licensing and other issues relating to software licensing and commercial usage are not covered in this research, owing to time limitations and the word limit of the report.
Objective 1: To conduct a generic overview on Quality of Service and identify its critical nature in information technology service environment with academic resources.
The literature review in chapter 2 provided a comprehensive overview of the concept of Quality of Service in business environments with respect to technology implementation. The literature review also established that Quality of Service from an information technology perspective is not only a matter of efficiency and punctuality but, just as essentially, of accuracy.
Objective 2: To investigate into the quality of service in an IT environment through a case-study analysis using secondary resources like journals and company Profiles.
The secondary research in chapter 3 presented a detailed analysis of the critical areas of information technology where Quality of Service can be achieved effectively and quantified easily for business purposes. The research also established that Quality of Service in an IT-based implementation is not just a matter of achieving the agreed output but, above all, of providing back-end support for the day-to-day business itself, enabling the client organization to gain competitive advantage in its target market.
Objective 3: To conduct primary research through implementing a Windows 2000 Server installation in a local area network and configuring the MS SQL Server 2000 RDBMS in the network.
The configuration exercise in chapter 4 revealed that, through efficient use of the technology and strict adherence to the agreed targets (including time deadlines), an IT solutions organization can deliver high Quality of Service, while the client company gains competitive advantage and increases Quality of Service in its own business through efficient use of the installed technology.
From the analysis it is clear that information technology has evolved from a merely technical element of a business into an integral part of it, used to achieve competitive advantage. It is further clear that IT-based systems are expected to provide high Quality of Service because of the heavy investment involved in the development, installation and maintenance of such projects. It is also clear that the secondary and primary research are in tandem with each other, and that the results achieved are coherent enough to support a concrete, research-based conclusion.
Hence, to conclude the dissertation, it is clear that Quality of Service is an essential element for the success of IT vendors as well as of the client organizations deploying the technology developed. The critical areas for achieving Quality of Service in an IT environment are listed below:
1. Efficient planning of the project, agreeing the deadlines and individual deliverables within it.
2. Strict adherence to the plan by adopting specific project development and management methodologies such as PRINCE2.
3. Efficient testing of the developed product for its usability in the real world.
4. Creating user-friendly, fast IT systems that eliminate redundancy and provide consistent output every time they are used.
5. Providing efficient support by reducing system outage levels and increasing the productivity of the IT project as well as of the entire organization.
The area of information technology is vast, and comprehensive coverage of QoS across the entire IT sector could not be accomplished within the word limit of this report. It is hence recommended that future research be conducted on a specific segment of IT to derive comprehensive results.
The implementation and configuration in the primary research was also conducted on a limited basis, owing to the lack of resources and finance for an enterprise-wide pseudo-implementation. It is hence recommended to conduct a pseudo-configuration on a larger network, incorporating more hardware and software elements, to obtain concrete results that can be extrapolated for trend analysis and for forecasting the life of the product.