Understanding Blockchain Technology

In recent years, blockchain technology has redefined the Internet, enabling a new kind of network in which digital information is distributed without being copied. The technology was originally conceived and devised for cryptocurrencies, digital currencies such as Bitcoin. Bitcoin is now often referred to as digital gold, and the total value of the currency has been estimated at roughly 9 billion US dollars. Blockchain technology can also be used to represent other kinds of digital value. Its inner workings are largely encapsulated, so users can rely on it without mastering the details, but a basic understanding of how it works makes it considerably easier to use.

In simple terms, a blockchain is an incorruptible digital ledger of transactions that can be programmed to record not just financial transactions but virtually anything of value.

Information stored on a blockchain is similar in spirit to data in a spreadsheet or a distributed database. Just as a spreadsheet can be updated regularly, a blockchain is updated from time to time as new blocks are added. The records, however, are not kept in a single private location; they are shared publicly so that they can be verified at any time. No centralized server holds the information: copies are stored across millions of computers connected to the internet, which makes the data extremely difficult to hack or corrupt.

Because the blocks of information can be accessed from more than one point in the network, no single entity controls the chain. And because multiple copies of the blockchain exist across the network, there is no single point of failure. The technology is transparent, tamper-resistant and highly decentralized, since the data is stored on many host machines across the network, and together these factors make a blockchain network highly secure.
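
To make the idea of a tamper-evident, linked ledger concrete, here is a minimal sketch in Python, not any production blockchain implementation, using an illustrative block layout of my own: each block stores the hash of the block before it, so altering an earlier record breaks every link that follows.

```python
import hashlib
import json
import time

def block_hash(block):
    """Hash the block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def new_block(data, previous_hash):
    """Create a block that records some data and links to the previous block."""
    return {
        "timestamp": time.time(),
        "data": data,
        "previous_hash": previous_hash,
    }

# Build a tiny chain: each block stores the hash of the one before it.
genesis = new_block("genesis", previous_hash="0" * 64)
chain = [genesis]
chain.append(new_block("Alice pays Bob 5", block_hash(chain[-1])))
chain.append(new_block("Bob pays Carol 2", block_hash(chain[-1])))

def chain_is_valid(chain):
    """Tampering with any earlier block breaks the hash links that follow it."""
    for prev, curr in zip(chain, chain[1:]):
        if curr["previous_hash"] != block_hash(prev):
            return False
    return True

print(chain_is_valid(chain))       # True
chain[1]["data"] = "Alice pays Bob 500"
print(chain_is_valid(chain))       # False: the tampering is detectable
```

Because every participant holds a copy of the chain and can re-run this check, a copy that has been tampered with is simply rejected by the rest of the network.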

All About Data

Information Technology-‘IT’-is expensive. Every CEO, CFO, COO-virtually every manager at every level of every company in existence knows that irrefutable fact. It’s usually a major line item in every company budget.

For the IT department, managing the company's technical capability means first finding a product that meets whatever need the company has come up with (and sometimes the odd whim of a single manager), then purchasing, installing, training on, and maintaining the hardware, software, networks, and databases that go along with the tool. Then you get to worry about interfaces between the tools, reports, security (both internal and external), and the wonderful, irrepressible, eccentric, and oftentimes destructive habits of the end users, even after you've provided complete end-to-end training on the new product.

Then someone changes their mind and you get to do it all over again.

I personally believe every metropolitan area should have a private asylum for IT managers. This facility should come complete with areas for the temporarily insane to conduct personal and violent destruction of computer hardware, provide them a setup of punching bags made to look like ignorant (not stupid, just unrealistic) company officers, and an additional stomping ground full of dummies made to resemble a variety of knuckle-headed end users. And a special place in hell for hackers…

Such a facility would be filled to capacity at all times.

The madness doesn't stop with the IT department. For other company managers, IT changes mean hours or days of training, plus the downtime and loss of productivity that come with IT problems, failures, or system upgrades.

For sales staff, an IT glitch can mean lost opportunity, loss of revenue, and a less than stellar image of the company that can stay in a customer’s mind for years. Salesmen may never overcome a bad customer experience generated by failed IT. The phrase “the computer is your friend” is not widely spoken among sales people.

But IT is a necessary evil, isn’t it? What company could function without it?

Well, it is necessary. Even an enterprising young man or woman entering the workforce for the first time mowing yards needs a way for customers to reach him/her, a way to manage a schedule, maybe even a way to track who has paid their bill.
But does IT have to be evil?

What if the evil-ness comes because we’re trying to solve the wrong problem with the IT? We’re trying to force a round peg into a square hole, imagining IT can solve our problem without actually identifying what the problem is? We buy computers, networks, communications, and all sorts of things to do one thing-capture and manage information. A simple fact that we inherently all know but gloss over-it isn’t the IT that’s important, it’s what’s traversing that IT.

It’s about the data. Those little bits and bytes that make up characters that make up data elements that coalesce into information that gives us knowledge that further transforms into intelligence that can be used and acted upon.

It’s about the data. Yet we chase the tools driving the data.

But wait (you say)! We have our databases. That’s part of the IT. That’s where our information is stored. We need the IT to get to our information. It’s OUR information.
Well, yes… sort of.

But not really. Your company does store its data in databases, usually inside a proprietary database that is part and parcel of the software you've purchased. Most-or at least a great deal-of that data is duplicated in other systems, some internal to your company, but most certainly in some external system as well. And setting up a database is hard work, what with getting items identified, parsed, moved into the proper fields, verified, and so on. It takes time and manpower, which translates into dollars spent.

And how easy is it to get it back out of that database once you’ve decided to move on to the next cool IT product? How many IT managers work an exit strategy at the same time they’re developing their acquisition strategy? If you buy a proprietary product, do you know what data rights you have and how you’ll get out of that product when the time comes? Because it will. (By the way, the answer is generally ‘no’-it’s hard enough to get the product up and running while telling your vendor and company leadership that you’re already planning (and spending funds on) its demise.)

OK, so who has the data (where is your database actually located, who manages it, who has access, and who owns it (don't make an assumption here))? How will it be delivered back so you can move it to a competitor's platform? Do you need to buy a proprietary (meaning, expensive) tool to extract the data? Who handles problems? Who maintains the documentation over the years so you actually know what that database looks like and exactly what each element means (because that changes too)? This borders on the geeky, but X may not always and forever mean X; maybe now it's X+2. Maybe X is now alphanumeric whereas it started out as numeric only. This information is absolutely vital-what changed and when? Without that documentation, you have no way of knowing whether your data is complete, whether it's actually correct, or whether it's been corrupted.
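
One lightweight way to keep that history, sketched here purely as an illustration (the field names and the element "X" history are invented for the example, not taken from any real system), is a versioned data-dictionary entry for each element:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ElementVersion:
    """One revision of a data element's definition."""
    effective: date
    data_type: str          # e.g. "numeric", "alphanumeric"
    meaning: str            # what the element represents at this point in time
    change_note: str        # why the definition changed

@dataclass
class DataElement:
    """A data-dictionary entry: the element plus its full change history."""
    name: str
    history: list[ElementVersion] = field(default_factory=list)

    def definition_on(self, when: date) -> ElementVersion:
        """Return the definition that was in force on a given date."""
        applicable = [v for v in self.history if v.effective <= when]
        if not applicable:
            raise ValueError(f"No definition of {self.name} before {when}")
        return max(applicable, key=lambda v: v.effective)

# Hypothetical history for the element 'X' mentioned above.
x = DataElement("X", [
    ElementVersion(date(2001, 1, 1), "numeric", "raw measurement", "initial definition"),
    ElementVersion(date(2010, 6, 1), "alphanumeric", "measurement plus unit code",
                   "units appended after system merge"),
])

print(x.definition_on(date(2005, 3, 15)).data_type)   # numeric
print(x.definition_on(date(2020, 1, 1)).data_type)    # alphanumeric
```

With a record like this, "what changed and when" is answerable years later, even after the people who made the change have moved on.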

Love your database managers.

So, back to who actually owns the data. Even if your eminently wise IT manager has the bases covered as far as database ownership goes, do you actually own the data elements?

No. You own the intelligence that comes from using the data, and any subsequent storage and retrieval of that intelligence, but you don’t really get to decide the data elements that comprise that intelligence.

For instance, the US Social Security number. The US Government owns it-its structure and rules, and the content assigned per individual. Your company has no say in the matter. It can, however, be used in multiple ways. Some IT systems use all nine characters-with or without the dashes-while some only keep the last four, six, or seven. Other countries have personal identification numbers that look nothing like the US SSN. Now what?
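
As a small illustration of how differently systems treat the same element, here is a hedged sketch (the function names and validation rules are my own, not any official standard) of normalizing a US SSN and deriving the truncated forms some systems keep:

```python
import re

def normalize_ssn(raw: str) -> str:
    """Strip dashes/spaces and check we are left with exactly nine digits."""
    digits = re.sub(r"[-\s]", "", raw)
    if not re.fullmatch(r"\d{9}", digits):
        raise ValueError(f"Not a nine-digit SSN: {raw!r}")
    return digits

def last_n(ssn: str, n: int) -> str:
    """The truncated form some systems store (last 4, 6, or 7 digits)."""
    return normalize_ssn(ssn)[-n:]

print(normalize_ssn("123-45-6789"))   # '123456789'
print(last_n("123 45 6789", 4))       # '6789'
```

A personal identification number from another country, with a different length or alphabet, fails this check immediately, which is exactly the "now what?" problem raised above.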

What does a 'name' look like, and who gets to decide (no system I'm aware of could capture the symbol Prince used for a while)? How long can a name be? What special characters are allowed? How many names can one person have (first/last/middle or 6-7 names, aliases, 'previously known as')?

Within countries, some will say the government owns much of the other personally identifiable information (generally abbreviated to "PII") for that country (the US probably among them). There may even be international consortia that believe they 'own' data related to their field of expertise (though I'd bet other consortia would disagree with that position).

We could go on. The point is that no one ‘owns’ a data element, at least nothing that’s agreed upon globally, and that’s a problem.


Because we are global creatures living as members of a global environment. No man (or country) is an island. Data travels around our world at the speed of thought through social media and interconnected systems. It’s perpetual-once a ‘thought’ is out there, it’s out there for good because somewhere it’s been captured by an IT ‘system’.

Data is accessible from virtually anywhere, and we can learn something about anything with a few keystrokes (although we have no way of knowing the veracity of what we find).

So, stuff is out there in a plethora of forms, some of it is correct, some of it isn’t, and you need special tools to get much of it.

How do we know what we know? Personally, I think this era will eventually be known as the Second Dark Ages, because we don't know what we know and have no way to capture (into perpetuity) our knowledge, or the trail of emails, notes, memos, and the like that tells how we came to that knowledge, why we made that decision, why that particular path was chosen, and so on.

PCs, laptops, and mobile devices contain a wealth of information that belongs to an individual or, often, to a company or organization. When a device goes to the Great Recycling Bin in the Sky, usually via a fried hard drive that makes the data it contains inaccessible, all that data is lost. Fade to black.

I attended a lecture once that said in 1900 human knowledge was doubling every 50 years. In 1950 it was every 25 years, in 1998 (when I heard this) it was every 10 years, and by 2020 it would be every 72 days. Say what? How do we capture that? How do we know what we know when it’s all captured in disparate databases, disparate devices, in different forms?

How on Earth do we manage all of this data/information/knowledge/intelligence?
We need help. We need the computers to help us. As in, Artificial Intelligence (AI). AI could help us make sense of all of it, except the ‘all of it’ is scattered and parsed all over the world without any standard form or organizational structure.

So-what if we stopped driving the IT and instead drove the data (which is what we want anyway)? Just suppose we got control of our data and standardized it across the globe?

Imagine it-data element X looks like this, means this, is accessed by this nomenclature, owned (controlled) by this organization and (maybe) even updated this way. IT could do whatever it wanted with it as long as it didn’t change the structure of the element!

It wouldn’t matter what IT tool we used-whatever suited our needs and budget-because our data was stand-alone and controlled like the Borg-collective. IT cannot change the structure or meaning of the data. Resistance is futile. Companies wouldn’t have to spend millions of dollars defining and documenting their database because it would be standardized. They would need only to define the data elements they’re interested in. A vendor developing a new IT tool wouldn’t have to modify their tool for every customer-the data routines would be standard (think Services Oriented Architecture on steroids).
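
As a toy illustration of that idea (the schema and names are hypothetical, not an existing standard), a globally defined element could carry its structure, meaning, and owning organization, and any IT tool would have to validate against it rather than redefine it:

```python
import re
from dataclasses import dataclass

@dataclass(frozen=True)
class StandardElement:
    """A globally standardized data element: fixed structure, meaning, and owner."""
    name: str          # the agreed nomenclature used to access it
    pattern: str       # the agreed structure, as a regular expression
    meaning: str       # what the element means
    owner: str         # the organization that controls the definition

    def validate(self, value: str) -> bool:
        """IT tools may store and move the value, but not change its structure."""
        return re.fullmatch(self.pattern, value) is not None

# Hypothetical registry entry treating the US SSN as a standardized element.
SSN = StandardElement(
    name="us.ssn",
    pattern=r"\d{9}",
    meaning="US Social Security number, nine digits, no dashes",
    owner="US Social Security Administration",
)

print(SSN.validate("123456789"))    # True: conforms to the agreed structure
print(SSN.validate("123-45-6789"))  # False: a tool must normalize, not redefine
```

Any vendor's tool could ship the same validation routine unchanged, which is the "standard data routines" point above.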

Wow! But how? How would we go about locking down data?

It would take a global endeavor, probably something under the United Nations.
Suppose there was a group that took care of everything associated with Personally Identifiable Information, another for education, another for health, another for accounting, etc etc etc…

It's mind-boggling. There'd have to be a group just to decide which group a piece of data should be sent to for management (is 'checking account' part of banking, accounting, personal, or business information?).

There would be arguments.

It’s been tried before of course, on much smaller scales by small organizations. None have been successful primarily because the group didn’t actually own the data. You can’t control what you don’t own.

I argue here, however, that it is no longer a matter of choice. If we are to avoid that Second Dark Ages, we must find a way to get control of our data, and it must be a universal endeavor.

Start small, think big, move fast…

Why not start with a group dedicated to individual attributes? Define those parameters. Establish the data structure. Define the tools to get to, update, and terminate that data. Then move out from there.
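
A minimal sketch of what "define the data structure and the tools to get to, update, and terminate that data" might look like, with invented names and a deliberately toy in-memory store, where only a designated steward is allowed to change a 'truth' element such as date of birth:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class IndividualAttribute:
    """A standardized individual attribute: structure is fixed, values are controlled."""
    element_id: str        # globally agreed identifier, e.g. "person.date_of_birth"
    value: str
    steward: str           # the group authorized to change it

class AttributeRegistry:
    """Toy registry exposing the three operations named above: get, update, terminate."""
    def __init__(self):
        self._store: dict[tuple[str, str], IndividualAttribute] = {}

    def get(self, person_id: str, element_id: str) -> Optional[IndividualAttribute]:
        return self._store.get((person_id, element_id))

    def update(self, person_id: str, attr: IndividualAttribute, requester: str) -> None:
        # Only the designated steward may change a 'truth' element.
        if requester != attr.steward:
            raise PermissionError(f"{requester} is not the steward of {attr.element_id}")
        self._store[(person_id, attr.element_id)] = attr

    def terminate(self, person_id: str, element_id: str, requester: str) -> None:
        attr = self._store.get((person_id, element_id))
        if attr and requester != attr.steward:
            raise PermissionError(f"{requester} is not the steward of {element_id}")
        self._store.pop((person_id, element_id), None)

# Usage: the national registry may update a date of birth; an IT vendor may not.
reg = AttributeRegistry()
dob = IndividualAttribute("person.date_of_birth", "1970-01-01", steward="national_registry")
reg.update("person-42", dob, requester="national_registry")
print(reg.get("person-42", "person.date_of_birth").value)   # 1970-01-01
```

In practice the registry would be a synchronized, worldwide service rather than a dictionary, but the shape of the three operations stays the same.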

Imagine a world where companies or individuals can buy any IT product off the shelf-without the current bureaucracy associated with major purchases-whenever they want, with whatever bells and whistles they want, because the data it uses is universally structured. It would save billions of dollars (after the Great Data Structure in the Sky was set up anyway).

Imagine a world where data is separate from the IT driving it. You can have a database, but you can't change the structure or meaning. Some elements may be un-updateable except by an authorized group (e.g., source data like date of birth), with a master version captured at whatever level is deemed appropriate-maybe national, then synchronized with a worldwide repository. You can coalesce the data into information and intelligence that (may) become another piece of information captured in its own right, but you do NOT update the 'truth' data element.

Yes, I know-take two aspirin and think it through… it will hurt the gray matter. It’s easier to chase the IT than get a handle on the data, which is why we go that route. But we have to start.

Benefits of 3-D Printing

My father’s dream was to build his own house.

However, building a house in India was an incredibly complicated affair.

First, you had to buy a plot of land. Next, you had to get an allocation from the government for all the materials you needed to make a house – cement, steel rebar, piping, you name it.

It was all in short supply because of India’s socialist economy, along with a quota system to dole out the limited quantities then available. And then to get your water, sewage and electricity connected… that was yet another ordeal.

Simply getting all this together was an enormous effort that took years. Of course, there was an easier way…

You hired a “fixer.”

In India, and anywhere else with a dysfunctional bureaucracy, a fixer is someone who knows all the right people. He greases someone’s palm over here and trades a favor with someone else over there to get things done.

With a fixer, what might have taken two or three years instead took just a few months.

Still, with all that, the house my father built took years to finish. However, today, a new technology is emerging that can shrink build times and make constructing a home much cheaper.

The Incredible Costs of Rebuilding

That technology is 3-D printing.

And this technology may suddenly become mainstream because of the incredible damage to housing that Hurricanes Harvey, Irma and Maria have done in 2017.

Harvey is estimated to have completely destroyed 12,700 homes.

Irma is estimated to have destroyed 25% of all homes in the Florida Keys.

Maria is estimated to have caused damage worth as much as $30 billion across the Caribbean.

Dominica, an island where I've gone hiking and canyoning, experienced a near-100% loss of houses and buildings. It's unlikely that Dominica can afford to reconstruct itself using the old-fashioned, traditional way of building houses and buildings. It would cost too much money, and it would take too long.

However, Cazza, a 3-D printing company, could have a solution.

Using Cazza’s X1 robot, 3-D printed buildings like houses, villas, shelters, warehouses and commercial buildings can go up in as little as one week.

Cazza believes that using its 3-D printing technology will save as much as 40% compared with the old, traditional ways of building.

That’s a $20,000 savings on a house that costs $50,000 to put up. And remember, the 3-D printed model gets you your house in a week instead of months or years.

The current damage estimate for the still-incomplete hurricane season is already as much as $340 billion.

If you assume that 30% of this damage is destroyed homes and buildings, implementing 3-D printing technology like the Cazza X1 to rebuild will save as much as $40.8 billion. That’s a big deal.
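
The arithmetic behind that figure, using the assumptions already stated (30% of the damage is destroyed homes and buildings, and a 40% construction saving from 3-D printing), works out like this:

```python
total_damage = 340e9          # estimated season damage in dollars
building_share = 0.30         # assumed share that is destroyed homes and buildings
printing_savings = 0.40       # Cazza's claimed construction savings

rebuild_cost = total_damage * building_share   # $102 billion to rebuild
savings = rebuild_cost * printing_savings      # $40.8 billion saved
print(f"${savings / 1e9:.1f} billion")         # $40.8 billion
```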

This is why I'm watching the 3-D printing technology used on homes and buildings carefully. Because, in time, the techniques used to rebuild from disasters are also going to be used to make regular homes less expensive. And the company that makes the 3-D printing technology is going to have a stock that soars for years.

All About Cloud Computing

Cloud computing is a clever and efficient way of sharing software systems through the cloud rather than each of us keeping individual copies of everything. Simply put, we may have the Facebook app on our devices, but the service itself runs in the cloud and we simply access it online.

It is quite certain that computer networks and the wonders of the internet have permeated into our bones, as most of us live, eat, study and work with computers. Each day seems to bring something startlingly new in the fabulous world of technology, whether in education, smartphones, cameras or automobiles. Space technology, medicine, and weapons are all racing to elevate their systems by means of superior technology.

Usage of cloud computing

Can you feel the enormous weight of all that Aadhaar or census data covering over a billion people? Can hundreds or thousands of computers handle all that information? While it is true that hard disks can contain thousands of books, where is it all going to end?

It is a profound mistake to think that the earth's resources will never end. Maybe our ancestors thought that way, or could not imagine the extent to which we would use up those resources. The day will come when we will have used up all the natural resources that now seem infinite, like the sun. The sun itself will burn out and life on earth will cease! Yet that day is too far away to start worrying about now!

Just as we use the electricity grid and pay for what we consume, the cloud meters its resources and charges according to the extent of usage. There is no need to replicate software and extensive hardware for each user. Costs get shared, much like five people carpooling to work instead of driving five separate vehicles!

Cloud facilities can take the form of a private cloud owned entirely by one organization for its own use, a public cloud that provides services over a network, or a hybrid cloud that combines several kinds of services.

Have you heard about IaaS, SaaS, PaaS and UCaaS? The "S" at the end stands for Service, and the initial letters stand for Infrastructure, Software, Platform and Unified Communications. A range of services is dispensed from the cloud, with organization and management left in the hands of the cloud provider. Users access the services through smartphones or laptops, just as they do now, and sign up for the services they wish to use, much as we opt for the channels we want from a cable operator.
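
A rough way to see how the models differ, sketched here as a commonly cited shared-responsibility split rather than any provider's official definition, is to ask which layers the provider manages and which are left to you:

```python
# A rough sketch of who manages which layer under each service model
# (a common shared-responsibility view; the exact split varies by provider).
STACK = ["hardware", "virtualization", "operating system",
         "runtime/platform", "application", "data"]

PROVIDER_MANAGES = {
    "IaaS": {"hardware", "virtualization"},
    "PaaS": {"hardware", "virtualization", "operating system", "runtime/platform"},
    "SaaS": {"hardware", "virtualization", "operating system",
             "runtime/platform", "application"},
}

for model, provider_layers in PROVIDER_MANAGES.items():
    customer_layers = [layer for layer in STACK if layer not in provider_layers]
    print(f"{model}: you manage {', '.join(customer_layers)}")
```

UCaaS is essentially the SaaS pattern applied to communications tools such as voice, chat and video.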

Benefits of cloud computing

Experts believe that such an arrangement would bring tremendous benefits compared with the present. Perhaps the greatest advantage is security, even though we may worry about handing over sensitive data: a large provider brings the shared strength of an army of people and systems to bear on protecting it.

The freedom to access the cloud from any point on earth through a browser is a mighty advantage, and it does not really matter what device you are working from. Costs and maintenance become simpler because centralized systems and software avoid unnecessary duplication for each user.

Performance, productivity, and reliability are all greatly enhanced under a large umbrella of experts, and the user has little to worry about except connecting and getting on with work or play. Three cheers for the cloud computing of the future!