Gartner, Inc.
Industry: Consulting
Number of terms: 1807
Number of blossaries: 2
Company Profile:
Gartner delivers technology research to global technology business leaders so they can make informed decisions on key initiatives.
A database appliance is a prepackaged or pre-configured, balanced set of hardware (servers, memory, storage and input/output channels), software (operating system, database management system (DBMS) and management software), service and support. It is sold as a unit with built-in redundancy for high availability and positioned as a platform for DBMS use in online transaction processing (OLTP) and/or data warehousing (DW).
Industry:Technology
Database activity monitoring (DAM) refers to a suite of tools that can be used to identify and report on fraudulent, illegal or other undesirable behavior, with minimal impact on user operations and productivity. The tools have evolved from basic analysis of user activity in and around relational database management systems (RDBMSs) to encompass a more comprehensive set of capabilities, such as discovery and classification, vulnerability management, application-level analysis, intrusion prevention, support for unstructured data security, identity and access management integration, and risk management support.
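As a rough illustration of the statement-analysis end of this spectrum, the following minimal Python sketch flags entries in a hypothetical database audit log against a toy rule set. The log format, table names and rules are invented for the example; real DAM tools operate in and around the RDBMS itself rather than on exported log files.

import re

# Toy rules over a hypothetical audit-log format: "timestamp user statement".
SUSPICIOUS_PATTERNS = [
    re.compile(r"SELECT\s+\*\s+FROM\s+credit_cards", re.IGNORECASE),  # bulk read of a sensitive table
    re.compile(r"GRANT\s+ALL", re.IGNORECASE),                        # overly broad privilege grant
    re.compile(r"DROP\s+TABLE", re.IGNORECASE),                       # destructive DDL
]

def flag_suspicious(audit_lines):
    # Yield (line_number, line) pairs whose SQL text matches any rule.
    for n, line in enumerate(audit_lines, start=1):
        if any(p.search(line) for p in SUSPICIOUS_PATTERNS):
            yield n, line.rstrip()

if __name__ == "__main__":
    sample = [
        "2024-01-05T10:02:11 alice SELECT name FROM employees",
        "2024-01-05T10:03:40 bob   SELECT * FROM credit_cards",
        "2024-01-05T10:04:02 carol GRANT ALL ON payroll TO PUBLIC",
    ]
    for n, line in flag_suspicious(sample):
        print(f"ALERT line {n}: {line}")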
Industry:Technology
Data wiping is the process of logically removing data from a read/write medium so that it can no longer be read. Performed externally by physically connecting storage media to a hardware bulk-wiping device, or internally by booting a PC from a CD or network, it is a nondestructive process that enables the medium to be safely reused without loss of storage capacity or leakage of data.
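As a rough, file-level illustration of the overwrite step, the Python sketch below overwrites a single file with random bytes and then deletes it. This is a toy under stated assumptions, not a substitute for a bulk-wiping device: whole-drive wiping works at the block-device level, and flash media with wear leveling can retain copies that file-level overwrites miss.

import os
import secrets

def wipe_file(path, passes=1, chunk_size=1024 * 1024):
    # Overwrite the file's existing bytes in place, then remove it.
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            remaining = size
            while remaining > 0:
                n = min(chunk_size, remaining)
                f.write(secrets.token_bytes(n))
                remaining -= n
            f.flush()
            os.fsync(f.fileno())  # push the overwrite through to the device
    os.remove(path)

if __name__ == "__main__":
    with open("scratch.bin", "wb") as f:
        f.write(b"sensitive data")
    wipe_file("scratch.bin")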
Industry:Technology
A data warehouse is a storage architecture designed to hold data extracted from transaction systems, operational data stores and external sources. The warehouse then combines that data in an aggregate, summary form suitable for enterprise-wide data analysis and reporting for predefined business needs. The five components of a data warehouse are: production data sources; data extraction and conversion; the data warehouse database management system; data warehouse administration; and business intelligence (BI) tools. A data warehouse contains data arranged into abstracted subject areas with time-variant versions of the same records, with an appropriate level of data grain or detail to make it useful across two or more different types of analyses; it is most often deployed with tendencies toward third normal form. A data mart contains similarly time-variant and subject-oriented data, but with relationships implying dimensional use of data, wherein facts are distinctly separate from dimension data, making data marts more appropriate for single categories of analysis.
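To make the fact/dimension distinction concrete, here is a minimal, hypothetical star schema built in Python with the standard sqlite3 module. The table and column names are invented for the example, and a production warehouse or mart would run on a full-scale DBMS rather than an in-memory SQLite database.

import sqlite3

# Dimension tables hold descriptive attributes; the fact table holds measures
# keyed by dimension references, keeping facts distinctly separate from dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key     INTEGER PRIMARY KEY,
        calendar_day TEXT,
        month        TEXT,
        year         INTEGER
    );
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT,
        category     TEXT
    );
    CREATE TABLE fact_sales (
        date_key     INTEGER REFERENCES dim_date(date_key),
        product_key  INTEGER REFERENCES dim_product(product_key),
        units_sold   INTEGER,
        revenue      REAL
    );
""")

# Analysis aggregates the facts and slices them by dimension attributes.
rows = conn.execute("""
    SELECT d.year, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.year, p.category
""").fetchall()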
Industry:Technology
Data synchronization is a form of embedded middleware that allows applications to update data on two systems so that the data sets are identical. These services can run via a variety of different transports but typically require some application-specific knowledge of the context and notion of the data being synchronized.
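A minimal sketch of the idea, assuming a toy key/value store with per-record timestamps and last-writer-wins conflict resolution (an invented scheme for illustration); real synchronization middleware also handles deletions, transports and application-specific conflict rules.

def synchronize(store_a, store_b):
    # Bring two {key: (timestamp, value)} stores to the same state.
    for key in set(store_a) | set(store_b):
        a, b = store_a.get(key), store_b.get(key)
        # The record with the newer timestamp wins; missing records are copied over.
        winner = max((rec for rec in (a, b) if rec is not None), key=lambda rec: rec[0])
        store_a[key] = winner
        store_b[key] = winner

a = {"cust_1": (100, "Alice"), "cust_2": (120, "Bob")}
b = {"cust_1": (150, "Alice Smith"), "cust_3": (110, "Carol")}
synchronize(a, b)
assert a == b  # the two data sets are now identical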
Industry:Technology
The data scientist role is critical for organizations looking to extract insight from information assets for “big data” initiatives, and it requires a broad combination of skills that may be better fulfilled as a team. For example: collaboration and teamwork are required for working with business stakeholders to understand business issues; analytical and decision modeling skills are required for discovering relationships within data and detecting patterns; and data management skills are required to build the relevant dataset used for the analysis.
Industry:Technology
The data replication segment includes a set of data replication products that reside in the disk array controller, in a device in the storage network or on a server. Included are local and remote replication products, migration tools, and disk imaging products. Also included are replication products specifically targeted as an alternative to backup applications. Not included are database replication products, log-based DBMS replication products or application-based replication products.
Industry:Technology
The market for data quality tools has become highly visible in recent years as more organizations understand the impact of poor-quality data and seek solutions for improvement. Traditionally aligned with cleansing of customer data (names and addresses) in support of CRM-related activities, the tools have expanded well beyond such capabilities, and forward-thinking organizations are recognizing the relevance of these tools in other data domains. Product data (often driven by MDM initiatives) and financial data (driven by compliance pressures) are two such areas in which demand for the tools is quickly building. Data quality tools are used to address various aspects of the data quality problem:
- Parsing and standardization: decomposition of text fields into component parts and formatting of values into consistent layouts based on industry standards, local standards (for example, postal authority standards for address data), user-defined business rules, and knowledge bases of values and patterns.
- Generalized “cleansing”: modification of data values to meet domain restrictions, integrity constraints or other business rules that define sufficient data quality for the organization.
- Matching: identification, linking or merging related entries within or across sets of data.
- Profiling: analysis of data to capture statistics (metadata) that provide insight into the quality of the data and aid in the identification of data quality issues.
- Monitoring: deployment of controls to ensure ongoing conformance of data to business rules that define data quality for the organization.
- Enrichment: enhancing the value of internally held data by appending related attributes from external sources (for example, consumer demographic attributes or geographic descriptors).
The tools provided by vendors in this market are generally consumed by technology users for internal deployment in their IT infrastructure, although hosted data quality solutions are continuing to emerge and grow in popularity. The tools are increasingly implemented in support of general data quality improvement initiatives, as well as within critical applications, such as ERP, CRM and BI. As data quality becomes increasingly pervasive, many data integration tools now include data quality management functionality.
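As a small illustration of two of the capabilities listed above, standardization and matching, the following Python sketch normalizes toy name/postal-code records and links those that agree after standardization. The field names and rules are invented; commercial tools rely on postal knowledge bases and far richer matching logic.

import re

def standardize(record):
    # Trim, collapse whitespace and upper-case the name; keep a five-digit ZIP.
    name = re.sub(r"\s+", " ", record["name"].strip()).upper()
    zip_code = re.sub(r"\D", "", record["zip"])[:5].zfill(5)
    return {"name": name, "zip": zip_code}

def is_match(rec_a, rec_b):
    # Link records whose standardized name and ZIP agree.
    a, b = standardize(rec_a), standardize(rec_b)
    return a["name"] == b["name"] and a["zip"] == b["zip"]

print(is_match({"name": " jane  doe ", "zip": "02139-4307"},
               {"name": "Jane Doe", "zip": "02139"}))  # True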
Industry:Technology
Data quality software as a service (SaaS) refers to data quality functionality (such as profiling, matching, standardization and validation) delivered using a model in which an external provider owns the infrastructure and provides the capabilities in a shared, multitenant environment used by its customers. Various offerings for data quality SaaS can be used as alternatives to in-house deployments of data quality software tools or the development of custom-coded solutions.
Industry:Technology
Data profiling is a technology for discovering and investigating data quality issues, such as duplication, lack of consistency, and lack of accuracy and completeness. This is accomplished by analyzing one or multiple data sources and collecting metadata that shows the condition of the data and enables the data steward to investigate the origin of data errors. The tools provide data statistics, such as degree of duplication and ratios of attribute values, both in tabular and graphical formats.
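A minimal sketch of the kind of statistics such tools surface, applied to one column of an in-memory dataset; the sample data and choice of statistics are illustrative only.

from collections import Counter

def profile_column(values):
    # Capture simple metadata: row count, distinct values, duplication and value ratios.
    counts = Counter(values)
    total = len(values)
    return {
        "row_count": total,
        "distinct_values": len(counts),
        "duplication_ratio": 1 - len(counts) / total if total else 0.0,
        "value_ratios": {v: c / total for v, c in counts.most_common()},
    }

# The inconsistent "us" entry shows up immediately in the value ratios.
print(profile_column(["US", "US", "DE", "us", "FR", "US"]))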
Industry:Technology