"Medium Data"

Peter Kam Fai Cheung SBS

"Big Data is inherently problematic because it is over-sized," I remarked in a talk on the many pitfalls of Big Data. "If Big Data is about machines and Small Data is about people, then their synthesis should be 'Medium Data', interfacing machines with humans," I argued, while others seemed amused by the new term I had coined.

Without data, whether small or big, in any context, neither machines nor humans can make informed decisions about the path ahead. If the data is small, any associations or patterns it reveals may be deficient. If the data is big, making sense of its excessive volume without falling into logical fallacies such as "false cause" is hard.

In my view, the social science at the intersection of statistics, economics, and machine-and-human learning can serve as the Aristotelian Mean. Robots and people alike can use "Medium Data" to discern interests or behaviours and their contextual value. Data-processing tools such as analytics and algorithms structured along the Mean model should enhance confidence levels.

Data regarding a passionately vocal minority should be processed in the context of the disinterested silent majority. I hypothesize that computing "Medium Data" this way should make the findings statistically significant. I believe the "Medium Data" synthesized by machines and humans can describe what happened, explain why, and predict a reasonable future!
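As a minimal sketch of the idea above, the toy numbers below (all hypothetical, not from the original) show how feedback from a vocal minority can be reweighted by population share before drawing conclusions, rather than taken at face value:

```python
# Toy sketch with hypothetical figures: suppose 5% of users are vocal
# and rate a feature 9/10 on average, while a background survey finds
# the silent 95% rate it 6/10.
vocal_share, vocal_mean = 0.05, 9.0
silent_share, silent_mean = 0.95, 6.0

# Naive reading: the loud feedback alone suggests a 9/10 feature.
naive_estimate = vocal_mean

# Weighted reading: each group's view counts in proportion to its size.
weighted_estimate = vocal_share * vocal_mean + silent_share * silent_mean

print(naive_estimate)     # 9.0
print(weighted_estimate)  # 6.15
```

The weighted figure (6.15) differs sharply from the naive one (9.0), illustrating why minority signals are best interpreted against the majority context.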
