Blockchain

In simple terms, a blockchain is a chain of blocks. In computer science terminology, it resembles a singly linked list: each block stores a reference to the previous block along with some data, hence the name.

How is Blockchain stored?

A blockchain is a distributed ledger maintained by several computers referred to as "nodes". Each node stores an entire copy of the chain, and each such copy is independently verifiable. Whenever a new node joins the network, it receives a full copy of the chain.

Characteristics of Blockchain:

  • Distributed – As mentioned above, no single node controls the blockchain, which makes it a distributed system.
  • Robust – Because it is a distributed system, the failure of any single node does not make the system inaccessible, unusable or unstable.
  • Secure – The data in each block can be encrypted. Each block holds a reference to the previous block, and that reference is derived by hashing the block's contents. If the data inside a block is changed, its hash changes too, which invalidates the rest of the chain on that node. In a distributed system, such tampered copies get rejected, so once data is added to the blockchain it is nearly impossible to modify that record (see the sketch after this list).
  • Transparent – Transparency extends only as far as every node holding a complete copy of the chain; the data inside each block can still be encrypted, so not every node is necessarily able to read it.
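To make the hash-linking idea concrete, here is a minimal sketch in Python. It is not the implementation of any particular blockchain; the Block class and verify_chain helper are illustrative names, and real systems add consensus, timestamps and proof-of-work on top.

```python
import hashlib
import json

class Block:
    """Illustrative block: some data plus the hash of the previous block."""
    def __init__(self, data, prev_hash):
        self.data = data
        self.prev_hash = prev_hash
        self.hash = self.compute_hash()

    def compute_hash(self):
        # A block's identity is a hash of its own data and the previous hash.
        payload = json.dumps({"data": self.data, "prev": self.prev_hash}, sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

def verify_chain(chain):
    # Valid only if every stored hash still matches the block's contents and
    # every block points at the unchanged hash of its predecessor.
    for i, block in enumerate(chain):
        if block.hash != block.compute_hash():
            return False
        if i > 0 and block.prev_hash != chain[i - 1].hash:
            return False
    return True

genesis = Block({"record": "land parcel #1 -> Alice"}, prev_hash="0" * 64)
second = Block({"record": "land parcel #1 -> Bob"}, prev_hash=genesis.hash)
chain = [genesis, second]
print(verify_chain(chain))                        # True

genesis.data["record"] = "land parcel #1 -> Eve"  # tamper with an old block
print(verify_chain(chain))                        # False: the hashes no longer line up
```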

But what is the use of blockchain?

It has several possible uses. The one that is currently a buzzword is Bitcoin, a cryptocurrency. More about this sometime later.

Among other uses, this technology can maintain land records, medical histories, audit trails or insurance claims. Typically, wherever you need to ensure the sanctity of an entire record, you could put blockchain to use.

Related Keywords:

Cryptocurrency, Bitcoin, Ethereum, Cryptography

API Gateway

An API Gateway is a server that acts as the single entry point to your system. As you can imagine, its primary function is to provide API endpoints to various clients while hiding the backend services from them.

Why should I use an API Gateway?

Consider a complex architecture where a variety of clients access your system. Monolithic applications are getting outdated, whereas microservices-based applications are popular because of their efficiency and maintainability. As a result, each client needs to make several calls to fetch data from the various services in your system; even rendering one page can require multiple round trips. This is a problem on mobile networks, which typically suffer from latency. To add to the complexity, each client can have its own requirements, which can also mean client-specific code.

All these problems can be addressed by implementing an API Gateway. This server provides a single point of reference for all clients. It detects the client and breaks the incoming request into multiple backend requests to fetch the data, which gives the additional benefit of consolidating the responses from all backend calls into a single response for the client.
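As a sketch of the fan-out and consolidation described above, here is a minimal Python example using asyncio. The fetch_* functions, the product-page shape and the latencies are hypothetical stand-ins for calls to real backend services; an actual gateway would also handle routing, authentication and client detection.

```python
import asyncio

# Hypothetical backend calls; in a real gateway these would be HTTP requests
# to internal services such as product, pricing and review microservices.
async def fetch_product(product_id):
    await asyncio.sleep(0.05)                      # simulate network latency
    return {"id": product_id, "name": "Widget"}

async def fetch_price(product_id):
    await asyncio.sleep(0.05)
    return {"price": 9.99, "currency": "USD"}

async def fetch_reviews(product_id):
    await asyncio.sleep(0.05)
    return {"reviews": ["Great!", "Works as expected."]}

async def product_page(product_id):
    # One client request fans out to several backend services in parallel,
    # then the results are consolidated into a single response.
    product, price, reviews = await asyncio.gather(
        fetch_product(product_id),
        fetch_price(product_id),
        fetch_reviews(product_id),
    )
    return {**product, **price, **reviews}

print(asyncio.run(product_page(42)))
```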

[Diagram: API Gateway, simplified. Source: https://www.nginx.com/blog/building-microservices-using-an-api-gateway/]

Are there any drawbacks?

Yes, as is the case with almost everything in life! If due care is not taken, the API Gateway itself can become heavy and turn into a bottleneck. Developers must update the gateway whenever they change the endpoints or the protocol used to access their respective services. And because the gateway gathers data from multiple web services, the failure of even one service can make the whole response unavailable or delay it for the client.

However, there are already ways to counter these drawbacks. Proper processes such as DevOps can prevent the API Gateway from becoming a bottleneck for developers. Circuit-breaker libraries such as Netflix Hystrix can avoid an overall service breakdown even in the case of a partial outage.
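Hystrix itself is a Java library, so the following is only a sketch of the circuit-breaker pattern in Python under simplified assumptions (a failure counter and a fixed fallback; real breakers also use timeouts and half-open probes). The fetch_reviews_from_service call in the usage comment is hypothetical.

```python
class CircuitBreaker:
    """Toy circuit breaker: open after N consecutive failures, then fail fast."""
    def __init__(self, max_failures=3, fallback=None):
        self.max_failures = max_failures
        self.fallback = fallback
        self.failures = 0

    def call(self, func, *args, **kwargs):
        if self.failures >= self.max_failures:
            # Circuit is open: return the fallback immediately instead of
            # waiting on a service that keeps failing.
            return self.fallback
        try:
            result = func(*args, **kwargs)
            self.failures = 0                      # success closes the circuit
            return result
        except Exception:
            self.failures += 1
            return self.fallback

# Hypothetical usage inside a gateway handler:
# reviews_breaker = CircuitBreaker(max_failures=3, fallback={"reviews": []})
# reviews = reviews_breaker.call(fetch_reviews_from_service, product_id)
```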

Example Implementations / Providers:

  • AWS API Gateway
  • Azure API Management
  • Vertx
  • JBoss Apiman

Related Keywords

Microservices Architecture, AWS, Azure, Hystrix

Supervised Learning

Supervised Learning is a methodology in the field of Machine Learning. An algorithm is developed from a known dataset and the known observations (outcomes) for that dataset. Once the algorithm is stable, researchers and developers apply it to a new but similar dataset to obtain observations about that dataset.

In this method, the known relationship between the dataset (training data) and the observations (outcomes) helps the algorithm improve. It is akin to a teacher supervising students learning a new technique, hence the name "Supervised Learning". The developer keeps refining the algorithm until it produces fairly accurate outcomes across the entire training set.

When to use Supervised Learning?

You have training data available or gathered, and after manual analysis you know the expected outcomes. You then need to predict the outcome for another, similar dataset for which the outcome is not yet available. This is the ideal condition for Supervised Learning.

Tasks involved in Supervised Learning

Typically, two types of tasks are involved in this kind of learning.

  • Classification: The algorithm assigns a category to each input. For example, if the training dataset is a set of files, the algorithm categorizes each file as a text file, an image file or a binary file.
  • Regression: The algorithm predicts a numerical value based on the training dataset (see the sketch after this list).
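A minimal sketch of both tasks, assuming scikit-learn is available; the tiny hand-made datasets (fruit measurements and house sizes) are purely illustrative.

```python
from sklearn.linear_model import LinearRegression, LogisticRegression

# Classification: label fruit as 0 (apple) or 1 (orange) from weight in grams
# and a smooth/rough texture flag; y_cls holds the known outcomes.
X_cls = [[140, 1], [130, 1], [160, 0], [170, 0]]
y_cls = [0, 0, 1, 1]
clf = LogisticRegression().fit(X_cls, y_cls)
print(clf.predict([[155, 0]]))        # predicted category for an unseen fruit

# Regression: predict a house price (a numerical value) from its size in sqft.
X_reg = [[600], [800], [1000], [1200]]
y_reg = [150_000, 200_000, 250_000, 300_000]
reg = LinearRegression().fit(X_reg, y_reg)
print(reg.predict([[900]]))           # predicted price for an unseen size
```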

Developers also need to consider the "bias vs. variance trade-off" when assessing an algorithm's accuracy. Sometimes the algorithm consistently produces incorrect output for a given input; this is referred to as "bias". Sometimes its predictions for the same input change a lot depending on the particular training data it saw; this is called "variance". It is usually impossible to minimize both bias and variance at once, so a balance between the two is required. Once such a balance is reached, developers can start using the algorithm on different datasets and continue to improve it.
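A small numeric illustration of the trade-off, assuming NumPy; the polynomial degrees and sample sizes are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 30)
y = np.sin(np.pi * x) + rng.normal(0, 0.2, x.size)   # noisy training data
x_test = np.linspace(-1, 1, 200)
y_test = np.sin(np.pi * x_test)                       # noise-free ground truth

for degree in (1, 4, 12):
    coeffs = np.polyfit(x, y, degree)
    test_error = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(degree, round(float(test_error), 4))
# Very low degrees tend to underfit (high bias), very high degrees tend to
# chase the noise (high variance); an intermediate degree usually balances both.
```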

Example of Supervised Learning:

Let's say you have 20 photos, each tagged with labels such as a person's name, a location and the type of photo. You develop a model using this information. Once done, you can feed another 20 photos to the model and see whether it has "learnt" from the earlier dataset.

Fun Fact:

You can find information in the "alt" tag of Facebook photos, such as "Image may contain: mountain, sky, outdoor" or "Image may contain: one person, standing, outdoor". This looks like AI running on the photos through a Supervised Learning model.

Related Keywords:

Machine Learning, Unsupervised Learning, Semi-supervised Learning, Active Learning

AI – Artificial Intelligence

Artificial Intelligence, or AI, can be described as intelligence that is not demonstrated by natural entities such as humans and other animals; rather, it is intelligence displayed by machines. AI is the branch of computer science concerned with making machines behave, think and act like humans.

Humans have long been trying to make machines smarter, and we are succeeding more than ever. With every passing day we build machines that are smarter than yesterday's, so the scope of AI keeps changing. As the saying goes, "AI is whatever hasn't been done yet".

Scientist John McCarthy coined the term "Artificial Intelligence" in 1956. He devoted much of his time to this field, in addition to developing the language LISP. For his contributions, he is referred to as the "Father of AI".

AI researchers typically focus on aspects such as reasoning, knowledge, planning, learning, NLP and the ability to move and manipulate objects. Several languages support these goals, including LISP, Prolog and Python.

Applications of Artificial Intelligence

  • Advanced Weather Modelling
  • Self Driving Cars
  • Pattern Recognition
  • Predictive Analytics by looking at large datasets
  • Automated traffic signal systems that adjust signal duration based on traffic density

Machine Learning and NLP are at the heart of Artificial Intelligence. Scientists and researchers are striving to make machines as intelligent as humans, or maybe even more intelligent. One breakthrough in this direction is the humanoid robot Sophia:

[Image: the humanoid robot Sophia. By International Telecommunication Union, CC BY 2.0 (http://creativecommons.org/licenses/by/2.0), via Wikimedia Commons]
Recent advances are making Artificial Intelligence a very active field. It will surely be a focus of the next decade, and we will surely see many more breakthroughs.

Related Keywords:

Machine Learning, NLP, Deep Learning, Neural Network, LISP, Prolog, Python

Graphics Processing Unit (GPU)

The Graphics Processing Unit, or GPU, as we have known it for many years, has gained new importance in the light of Artificial Intelligence. A GPU is a specialized circuit designed to rapidly manipulate memory in order to accelerate the creation of images.

GPUs are highly efficient at manipulating computer graphics and at image processing. They have now also gained importance for fast, parallel computation in general. A GPU has a parallel architecture consisting of several thousand smaller yet efficient cores designed to handle many tasks simultaneously. This is where it differs from a CPU, which has only a few cores optimized for sequential processing.

GPU in Artificial Intelligence

GPUs have been found to be tremendously powerful compared with CPUs. In one project, 12 NVIDIA GPUs delivered the deep-learning performance of 2,000 CPUs. That is phenomenal! NVIDIA GPUs are speeding up DNNs (Deep Neural Networks) by 10-20x, reducing training times for Artificial Intelligence workloads. NVIDIA also provides a rich developer platform, CUDA, which improves developers' productivity and helps them innovate quickly.
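As a sketch of what that parallelism looks like from code, here is a small Python example assuming the CuPy library (a NumPy-compatible CUDA array library) and an NVIDIA GPU are available; the array size is arbitrary.

```python
import numpy as np
import cupy as cp   # NumPy-compatible arrays stored in GPU memory (needs CUDA)

# The same element-wise computation on CPU and GPU. Every element can be
# processed independently, which is what thousands of GPU cores are built for.
x_cpu = np.random.rand(10_000_000).astype(np.float32)
y_cpu = np.sqrt(x_cpu) * 2.0 + 1.0           # runs on a few CPU cores

x_gpu = cp.asarray(x_cpu)                    # copy the data to GPU memory
y_gpu = cp.sqrt(x_gpu) * 2.0 + 1.0           # runs across thousands of GPU cores
y_back = cp.asnumpy(y_gpu)                   # copy the result back to the host

print(np.allclose(y_cpu, y_back, atol=1e-5))
```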

Other Uses of GPU

For a long time we knew GPUs only for graphics-related uses such as gaming, and several gaming consoles were powered by GPUs. However, as explained above, GPUs are now very popular in the field of Artificial Intelligence. They have also proved extremely useful in several other areas, such as:

  • Self-driving cars – training algorithms to detect vehicles even in difficult conditions
  • Healthcare and Life Sciences – deep genomics studies
  • Robots

It is evident that the parallel processing GPUs offer is going to dominate the near future, as can be seen from investor interest in this field. In the last year alone there have been several investments from key VCs in the hardware space.

Related Keywords

Artificial Intelligence, DNN, CNN, NVIDIA, CUDA