
Data philanthropy

Data philanthropy describes a form of collaboration in which private sector companies share data for public benefit. Multiple uses of data philanthropy are being explored, spanning humanitarian, corporate, human rights, and academic applications. Since introducing the term in 2011, the United Nations Global Pulse has advocated for a global "data philanthropy movement".

Administrative data

Administrative data are collected by governments or other organizations for non-statistical purposes, such as registration, transactions, and record keeping. They capture part of the output of administering a program. Birth and death records, the regulation of people and goods crossing borders, pensions, and taxation are examples of administrative data. These types of data are used to produce management information, such as registration data, in a cost-effective way. This enables administrative data, when turned into indicators, to show trends over time and reflect real-world ...
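Turning administrative records into an indicator can be as simple as aggregating them over time. The following minimal sketch (the records and field names are hypothetical, purely for illustration) derives a yearly trend indicator from birth-register entries:

```python
from collections import Counter
from datetime import date

# Hypothetical birth-register entries: administrative records collected
# for registration, not originally for statistics.
birth_register = [
    {"id": 1, "date": date(2021, 3, 14)},
    {"id": 2, "date": date(2021, 9, 2)},
    {"id": 3, "date": date(2022, 1, 20)},
]

# Indicator: registered births per year, showing a trend over time.
births_per_year = Counter(rec["date"].year for rec in birth_register)
print(sorted(births_per_year.items()))  # [(2021, 2), (2022, 1)]
```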

Big Data Maturity Model

Big Data Maturity Models (BDMMs) are artifacts used to measure Big Data maturity. These models help organizations create structure around their Big Data capabilities and identify where to start. They provide tools that help organizations define goals for their big data program and communicate their big data vision to the entire organization. BDMMs also provide a methodology to measure and monitor the state of a company's big data capability, the effort required to complete the current stage or phase of maturity, and the effort to progress to the next stage. Additionally, BDMMs measure a ...
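A maturity assessment of this kind can be sketched as a mapping from per-capability scores to a stage label. The stage names and capability names below are hypothetical, not drawn from any specific published BDMM:

```python
# Illustrative five-stage maturity ladder (names are assumptions).
STAGES = ["ad hoc", "opportunistic", "repeatable", "managed", "optimized"]

def maturity_stage(scores: dict) -> str:
    """Map the average capability score (0-4) to a stage label."""
    avg = sum(scores.values()) / len(scores)
    return STAGES[min(int(avg), len(STAGES) - 1)]

# Hypothetical assessment of three capabilities on a 0-4 scale.
assessment = {"data governance": 2, "analytics": 3, "infrastructure": 1}
print(maturity_stage(assessment))  # "repeatable" (average score 2.0)
```

Real BDMMs differ in their stage definitions and scoring rubrics; the point here is only the shape of the measurement: score capabilities, then locate the organization on a staged ladder.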

BisQue (Bioimage Analysis and Management Platform)

BisQue is a free, open-source, web-based platform for the exchange and exploration of large, complex datasets. It is being developed at the Vision Research Lab at the University of California, Santa Barbara. BisQue specifically supports large-scale, multi-dimensional, multimodal images and image analysis. Metadata are stored as arbitrarily nested and linked tag/value pairs, allowing for domain-specific data organization. Image analysis modules can be added to perform complex analysis tasks on compute clusters. Analysis results are stored within the database for further querying and processing ...
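Arbitrarily nested tag/value pairs of this kind can be represented as an XML tree. The following sketch is illustrative of the idea only, using Python's standard library rather than BisQue's actual API; the tag names and values are made up:

```python
from xml.etree.ElementTree import Element, SubElement, tostring

# Hypothetical image record with nested tag/value metadata.
image = Element("image", name="cells_01.tif")
expt = SubElement(image, "tag", name="experiment")
SubElement(expt, "tag", name="stain", value="DAPI")
SubElement(expt, "tag", name="magnification", value="40x")

# Nesting lets each domain organize its own metadata hierarchy.
print(tostring(image, encoding="unicode"))
```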

Burst buffer

In the high-performance computing environment, a burst buffer is a fast, intermediate storage layer positioned between the front-end computing processes and the back-end storage systems. It emerged as a timely storage solution to bridge the ever-increasing performance gap between the processing speed of the compute nodes and the input/output (I/O) bandwidth of the storage systems. A burst buffer is built from arrays of high-performance storage devices, such as NVRAM and SSDs. It typically offers one to two orders of magnitude higher I/O bandwidth than the back-end storage systems.
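The idea can be modeled in a few lines: writes land quickly in a fast tier, and a background worker drains them to slower storage. This is a toy sketch of the pattern, not how any real burst buffer implementation works:

```python
import queue
import threading
import time

burst_buffer = queue.Queue()  # fast tier: absorbs the write burst
backend = []                  # stands in for slow back-end storage

def drain():
    """Background drain from the burst buffer to the back end."""
    while True:
        block = burst_buffer.get()
        if block is None:  # sentinel: burst is over
            break
        time.sleep(0.01)   # simulate limited back-end I/O bandwidth
        backend.append(block)

drainer = threading.Thread(target=drain)
drainer.start()

# The compute side finishes its burst immediately; draining overlaps
# with whatever the compute nodes do next.
for i in range(5):
    burst_buffer.put(f"block-{i}")
burst_buffer.put(None)
drainer.join()
print(len(backend))  # 5
```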

Continuous analytics

Continuous analytics is a data science process that abandons ETL (extract, transform, load) jobs and complex batch data pipelines in favor of cloud-native and microservices paradigms. Continuous data processing enables real-time interactions and immediate insights with fewer resources.
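The contrast with batch processing can be sketched with a running aggregate that updates on every event, so an insight is available immediately rather than after a scheduled ETL run. This is a minimal illustration, not a specific streaming framework:

```python
def continuous_average():
    """Generator that yields an updated average after each event."""
    count, total = 0, 0.0
    avg = None
    while True:
        value = yield avg  # insight is current after every event
        count += 1
        total += value
        avg = total / count

stream = continuous_average()
next(stream)  # prime the generator
for reading in [10.0, 20.0, 30.0]:
    current = stream.send(reading)
print(current)  # 20.0
```

In a batch pipeline the same average would only appear after the whole batch is collected and processed; here each arriving reading refreshes it.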

