
Posts

Showing posts from January, 2017

Tech Archaeology: Unearthing the Artifacts of a False Prediction

Greetings. This is going to be a shorter rant. New year, new me! Anyway, I was inspired to write this after I caught myself falling into a familiar habit: investigating the validity of a prediction which claims that some technology (it could be anything) will take over in the future. I'll start from the beginning. It all began when I was dutifully studying for my Databases class. While reading the textbook, Database Processing (13th edition) by Kroenke and Auer, I came across a passage summarizing the history of database processing. Given that this book first came out around 1977, it has probably witnessed very few shifts in the popularity of database technology over its existence; namely, the rise of the relational model and its subsequent dominance. Nevertheless, in a table that describes the emergence of database technology, there is a row for the "XML and Web Services" era (after "Open-Source DBMS" and right before the "Big Data and NoSQL"

Parallelism and Task-Decomposition: An Introduction

Introduction Since I’m on holiday break from University, I’ve had time to begin investigating parallel processing. I’m going to try to share a little of what I’ve learned about the technology and how programming languages can leverage multi-core CPUs and GPUs. I’ll finish up by explaining a bit about task-decomposition, which is an important aspect of writing algorithms intended for parallel processing. My investigation began one night as I was cooking dinner and watching a talk by Rich Hickey, the inventor of Clojure, titled “Are we there yet?”. At one point, while Mr. Hickey was discussing parallel processes, the question “how does a GPU process pixels?” popped into my head. Surely that must be parallel, right? Since my “Intro to comp-sci” course last semester didn’t get into graphics pr
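To make the idea of task-decomposition a little more concrete, here’s a minimal sketch in Python (the post itself shows no code, so the language and the names brighten_row and brighten_image are purely illustrative). Each row of an image is an independent piece of work, so a per-pixel operation can be handed out row-by-row to a pool of worker processes: a tiny CPU-side analogue of a GPU applying the same operation to many pixels at once.

```python
from concurrent.futures import ProcessPoolExecutor

def brighten_row(row):
    # The per-pixel operation: add 50 to each grayscale value, capped at 255.
    return [min(pixel + 50, 255) for pixel in row]

def brighten_image(image):
    # Task-decomposition: each row is an independent task, so the pool
    # can process different rows on different CPU cores at the same time.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(brighten_row, image))

if __name__ == "__main__":
    # A tiny 3x4 grayscale "image" of pixel values (0-255).
    image = [
        [0, 64, 128, 192],
        [10, 70, 130, 200],
        [20, 80, 140, 210],
    ]
    print(brighten_image(image))
```

The decomposition here happens to be by row, but it could just as well be by pixel or by tile; the important part is that the pieces don’t depend on one another, so no coordination is needed while they run.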