ICO

ICO means Initial Coin Offering. It is the latest buzzword and the latest way startups are raising money. The moment I say "raising money", you will be tempted to compare this term with an IPO (Initial Public Offering). However, hold on. They are similar, yet very different.

[Image: ICO – Initial Coin Offerings increased multifold in 2017]

What is ICO then?

In an ICO, a company sells tokens or coins to people or entities who are willing to invest their money in the company's project or product. The coins or tokens can be purchased with other digital currencies such as Bitcoin or Ether, or even with fiat money (e.g. USD). The company selling the coins announces the purpose for which it is raising the money, and investors who believe in that purpose purchase the coins. If the company's project or product is launched and becomes successful, one can expect the value of the purchased coins to go up. The investor can then sell his/her tokens and make a profit.

All this sounds similar to an IPO, right?

Let’s talk about the real difference.

When you invest money in an IPO, you get shares in a publicly traded company. You get voting rights, and you get ownership in the company equivalent to your shares. You can decide to increase your stake in the company. The company is required to disclose its accounts and audit reports, and it is regulated by a central agency such as the SEC (US) or SEBI (India).

However, in the case of an ICO, you do not get any ownership of the company. You neither get voting rights nor can you hope to increase your stake in the company. The only thing you can hope for is that the tokens increase in value so that you get your returns.

An ICO can also be thought of as crowdfunding through cryptocurrency.

Facts about ICO

  • First token sale – July 2013
  • Tracked via CoinDesk
  • Money raised until now – USD 3.3B


Related Keywords:

Cryptocurrency, Bitcoin, Blockchain

OLTP

OLTP means Online Transaction Processing. It is a type of system used in applications that involve a large number of short transactions.

Important Features of OLTP databases:

  • Transactions involve very small amounts of data
  • Data Integrity
  • Indexed access to data
  • Performance focus is on Transactions Per Second

Examples of OLTP Databases:

  • MySQL
  • SQL Server
  • PostgreSQL

OLTP databases are ACID-compliant. This means the database ensures that transactions are "Atomic", "Consistent", "Isolated" and "Durable" (a small sketch of atomicity follows the list below).

  • Atomicity means either the transaction is recorded successfully or it is rolled back; it is an "all or nothing" operation. This ensures that in case of a failure the database state is left unchanged.
  • Consistency means every transaction brings the database from one valid state to another. That is, data written to the database must follow all applicable database rules (constraints, cascades, triggers and so on). Note that this does not cover application-level logic.
  • Isolation means the database reaches the same state whether transactions are executed concurrently or sequentially.
  • Durability means that once a transaction is committed it remains stored, even in the face of issues such as power failures or crashes.
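
As a rough illustration of atomicity, here is a minimal sketch using Python's built-in sqlite3 module (the table, column names and amounts are made up for the example): either both sides of a transfer are written, or the whole transaction is rolled back.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 50)")
conn.commit()

try:
    with conn:  # sqlite3 uses the connection as a transaction context manager
        conn.execute("UPDATE accounts SET balance = balance - 80 WHERE name = 'alice'")
        # Simulate a crash before the matching credit is applied
        raise RuntimeError("simulated failure in the middle of the transfer")
        # The credit would have gone here:
        # conn.execute("UPDATE accounts SET balance = balance + 80 WHERE name = 'bob'")
except RuntimeError:
    pass

# Thanks to atomicity the debit was rolled back and nothing was persisted
print(conn.execute("SELECT name, balance FROM accounts ORDER BY name").fetchall())
# -> [('alice', 100), ('bob', 50)]
```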

Applications of OLTP databases:

  • Banking applications
  • Retail orders

Typically, large applications make use of both OLTP and OLAP (Online Analytical Processing) databases. OLAP focuses on analysis of data and may not match the transactions-per-second performance of OLTP databases.


Related Keywords

Databases, OLAP, ACID


Blockchain

A blockchain, in simple terms, is a chain of blocks. In computer science terminology it resembles a singly linked list: each block stores a reference to the previous block along with some data, hence the name.

How is Blockchain stored?

A blockchain is stored as a distributed ledger across several computers, referred to as "nodes". Each node stores an entire copy of the chain, and each copy is independently verifiable. Whenever a new node joins the network, it receives the entire copy of the chain.

Characteristics of Blockchain:

  • Distributed – As mentioned above, no single node controls the blockchain, which makes it a distributed system.
  • Robust – Since it is a distributed system, the failure of any single node does not make the system inaccessible, unusable or unstable.
  • Secure – The data in each block could be encrypted. Each block maintains a reference to the previous block, and that reference is derived by hashing the block's data (a cryptographic hash, not encryption). As a result, if the data inside a block is changed, its hash also changes, which in turn invalidates the rest of the chain on that node. In a distributed system such tampered copies get rejected, so once your data is added to the blockchain it is nearly impossible to modify that record (see the sketch after this list).
  • Transparent – The transparency is only to the extent that every node has a complete copy of the chain. However, the data inside each block could be encrypted, so not every node would be able to read it.
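
As a minimal sketch (assuming SHA-256 as the hash function; real blockchains add proof-of-work, timestamps, signatures and more), the following Python snippet shows how each block's hash covers the previous block's hash, so changing an old block invalidates everything after it:

```python
import hashlib
import json

def block_hash(data, prev_hash):
    """Hash a block's contents together with the previous block's hash."""
    payload = json.dumps({"data": data, "prev_hash": prev_hash}, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev_hash,
                  "hash": block_hash(data, prev_hash)})

def is_valid(chain):
    for i, block in enumerate(chain):
        # Each block's stored hash must match its contents...
        if block["hash"] != block_hash(block["data"], block["prev_hash"]):
            return False
        # ...and must correctly reference the block before it
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
add_block(chain, "Alice pays Bob 5")
add_block(chain, "Bob pays Carol 2")
print(is_valid(chain))                   # True

chain[0]["data"] = "Alice pays Bob 500"  # tamper with an old record
print(is_valid(chain))                   # False - the rest of the chain is now invalid
```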

But what is the use of blockchain?

It has several possible uses. The one that is currently a buzzword is Bitcoin, the cryptocurrency. More about this sometime later.

Among other uses, one can apply this technology to maintain land records, medical histories, audit trails or insurance claims. Typically, wherever you need to ensure the sanctity of an entire record, you could put blockchain to use.


Related Keywords:

Cryptocurrency, Bitcoin, Ethereum, Cryptography

API Gateway

An API Gateway is a server that acts as a single entry point to your system. As you can imagine, the primary function of this server is to provide the various API endpoints to the various clients. It also hides the backend services from the client.

Why should I use API Gateway?

Consider a complex architecture where a variety of clients access your system. In this age, monolithic applications are becoming outdated, while microservices-based applications are popular because of their efficiency and maintainability. As a result, each client needs to make several calls to get data from the various services in your system; even to render one page, a client may end up making several calls. This is a problem on mobile networks, which typically have latency issues. To add to this complexity, each client could have its own requirements, which could also mean client-specific code.

All these problems can be solved by implementing an API Gateway. This server provides a single point of reference for all the clients. It detects the client and breaks the request into multiple backend requests to fetch the data. This gives the additional benefit of consolidating the responses from all the backend requests into a single response for the client.
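
As a rough sketch of that fan-out-and-aggregate behaviour (the backend URLs, service names and handler are hypothetical, and a real gateway would add authentication, routing, caching and error handling), a gateway endpoint might call several services in parallel and merge the results:

```python
import json
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

# Hypothetical internal microservices sitting behind the gateway
BACKENDS = {
    "profile": "http://user-service.internal/profile/42",
    "orders": "http://order-service.internal/orders?user=42",
    "recommendations": "http://reco-service.internal/reco/42",
}

def fetch(url):
    with urlopen(url, timeout=2) as resp:
        return json.load(resp)

def account_page_handler():
    """One client call -> several backend calls -> one consolidated response."""
    with ThreadPoolExecutor() as pool:
        futures = {name: pool.submit(fetch, url) for name, url in BACKENDS.items()}
        return {name: future.result() for name, future in futures.items()}
```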

[Image: API Gateway simplified diagram. Source: https://www.nginx.com/blog/building-microservices-using-an-api-gateway/]

Are there any drawbacks?

Yes, as is the case with almost everything in life! If due care is not taken, the API Gateway itself can become heavy and a bottleneck. All developers are required to update the API Gateway whenever they change the endpoints or the protocol used to access their respective services. And because the API Gateway gathers data from multiple web services, the failure of even one service could make the entire response unavailable, or add delays in the response to the client.

However, there are already ways to counter these drawbacks. A proper process such as DevOps can keep the API Gateway from becoming a developer bottleneck, and circuit-breaker libraries such as Netflix Hystrix can avoid an overall service breakdown even in case of a partial service outage.
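
A minimal sketch of the circuit-breaker idea itself (not Hystrix's actual API; thresholds and names are illustrative): after a few consecutive failures the gateway stops calling the failing service for a while and returns a fallback instead, so one broken backend does not drag down the whole page.

```python
import time

class CircuitBreaker:
    """Tiny circuit breaker: open after `max_failures`, retry after `reset_after` seconds."""

    def __init__(self, max_failures=3, reset_after=30):
        self.max_failures = max_failures
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at = None

    def call(self, func, fallback):
        if self.opened_at is not None:
            if time.time() - self.opened_at < self.reset_after:
                return fallback          # circuit is open: fail fast, skip the backend
            self.opened_at = None        # window elapsed: allow a trial call (half-open)
            self.failures = 0
        try:
            result = func()
            self.failures = 0            # success closes the circuit again
            return result
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.time()
            return fallback

# Example: degrade gracefully to an empty recommendation list instead of failing the page
reco_breaker = CircuitBreaker()
recommendations = reco_breaker.call(lambda: fetch(BACKENDS["recommendations"]), fallback=[])
```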


Example Implementations / Providers:

  • AWS API Gateway
  • Azure API Management
  • Vertx
  • JBoss Apiman

Related Keywords

Microservices Architecture, AWS, Azure, Hystrix

Supervised Learning

Supervised Learning is a methodology in the Machine Learning field. In this methodology, an algorithm is developed based on a known dataset and known observations from that dataset. Once the algorithm is stable, researchers/developers use it on new but similar datasets to obtain observations about those datasets.

In this method, the known relationship between the dataset (training data) and the observations (outcomes) helps the algorithm improve. This is akin to a teacher supervising students while they learn a new technique, and hence the method is referred to as "Supervised Learning". The developer keeps improving the algorithm until it produces fairly accurate outcomes for the entire training set.

When to use Supervised Learning?

You have training data available or gathered. After manual analysis, you know the expected outcomes. You are then required to find the outcomes for another similar dataset for which they are not yet available. This is an ideal situation in which to use Supervised Learning.

Tasks involved in Supervised Learning

Typically there are two types of tasks involved in this type of learning.

  • Classification: In this case, the algorithm assigns a category to each input. For example, if the training dataset is a set of files, the algorithm will categorize each file as a text, image or binary file (see the sketch after this list).
  • Regression: In this task, the algorithm predicts a numerical value based on the training dataset (e.g. the price of a house given its size and location).
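
As a minimal sketch of the classification task (assuming scikit-learn is available; the features, labels and file statistics are toy data), the model learns from labelled examples and is then applied to a new, unlabelled input:

```python
from sklearn.tree import DecisionTreeClassifier

# Toy training data: [file size in KB, fraction of printable characters]
X_train = [[2, 0.99], [5, 0.98], [800, 0.40], [1200, 0.35], [300, 0.10], [450, 0.05]]
y_train = ["text", "text", "image", "image", "binary", "binary"]

model = DecisionTreeClassifier()
model.fit(X_train, y_train)            # "supervision": known inputs paired with known outcomes

# Apply the trained model to a new, similar observation with no label
print(model.predict([[600, 0.38]]))    # e.g. ['image']
```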

Developers often need to consider the "bias vs variance trade-off" while assessing the accuracy of the algorithm. Sometimes the algorithm consistently produces outputs that are off in the same way for a given input; this is referred to as "bias". Sometimes the algorithm produces quite different outputs for the same input when trained on slightly different data; this is called "variance". It is usually impossible to minimize both bias and variance at the same time, so a balance between the two is required. Once such a balance is reached, developers can start using the algorithm on different datasets and continue to improve it.
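
For a quantitative view, the standard bias–variance decomposition for squared-error loss (assuming $y = f(x) + \varepsilon$ with noise variance $\sigma^2$, and $\hat{f}$ the learned model) splits the expected error into exactly these two terms plus irreducible noise:

$$
\mathbb{E}\big[(y - \hat{f}(x))^2\big]
= \underbrace{\big(\mathbb{E}[\hat{f}(x)] - f(x)\big)^2}_{\text{bias}^2}
+ \underbrace{\mathbb{E}\Big[\big(\hat{f}(x) - \mathbb{E}[\hat{f}(x)]\big)^2\Big]}_{\text{variance}}
+ \underbrace{\sigma^2}_{\text{irreducible noise}}
$$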

Example of Supervised Learning:

Let's say you have 20 photos and each of them is tagged with labels such as person name, location and type of photo. You develop a model using this information. Once done, you can feed another 20 photos to the model and see whether it has "learnt" from the earlier dataset.

Fun Fact:

You can find such information in the "alt" tag of Facebook photos: "Image may contain: mountain, sky, outdoor" or "Image may contain: one person, standing, outdoor". This looks like AI running on the photos through a Supervised Learning model.


Related Keywords:

Machine Learning, Unsupervised Learning, Semi-supervised Learning, Active Learning