Chef

“Chef” is an automation platform for DevOps. It helps you manage the configuration of your servers in an automated fashion: pushing patches or configuration files out to a set of servers in your server farm, and similar activities, can all be managed through this platform.

[Image: Chef carefully crafting her recipes]

What problem does Chef solve?

As your infrastructure grows, it becomes difficult to maintain all the servers at all times. You may have all your servers on-premise (very rare these days), entirely in the cloud, or in a hybrid setup. Irrespective of where the servers are located, you want all of them to be manageable, and you want to push configuration changes in a jiffy without worrying about how many servers need to be updated. With 5-10 servers, you may still do this manually, but as the number grows, the task becomes daunting. This is precisely where the Chef platform comes to the rescue.

How does Chef work?

Chef helps you look at your infrastructure as code! What does that mean? You write down the configuration you want on a given set of servers; this is known as a recipe. Each recipe can be tested and then pushed to target servers, referred to as “nodes”, where it is executed to bring the server configuration in line with what the recipe describes. All of this can be driven from a single machine, no matter how many servers need to be updated.

Each node registers itself with the Chef server and pulls the recipes that need to be executed on it. Even after execution, it periodically checks with the Chef server to see whether new recipes are available. This allows the administrator to push configuration changes without having to log in to individual servers.
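
As a rough sketch of this pull model (this is not Chef's actual client or API; the endpoint, JSON shape, and recipe names below are hypothetical), a node's check-and-converge loop conceptually looks like this:

```python
import json
import time
import urllib.request

# Hypothetical endpoint; a real Chef node talks to the Chef server's authenticated API.
RUN_LIST_URL = "https://chef-server.example.com/nodes/web-01/run-list"

def fetch_run_list():
    """Ask the central server which recipes this node should apply."""
    with urllib.request.urlopen(RUN_LIST_URL) as response:
        return json.load(response)  # e.g. ["recipe[nginx]", "recipe[app::deploy]"]

def converge(run_list):
    """Apply each recipe; real Chef executes the resources declared in each recipe."""
    for recipe in run_list:
        print(f"applying {recipe} ...")

if __name__ == "__main__":
    while True:
        converge(fetch_run_list())
        time.sleep(1800)  # re-check the server for updated recipes every 30 minutes
```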

[Image: Chef block diagram. Source: https://docs.microsoft.com/en-us/azure/virtual-machines/windows/chef-automation]

A collection of recipes is called a cookbook. Several cookbooks are available out of the box, and you can also pick up a ready-made cookbook from the “Chef Supermarket” marketplace. Of course, you can create your own recipes and cookbooks as well.

Chef has built-in support for building, testing, and deploying recipes. It also provides tools such as Chef Automate, which lets you monitor and collect data across your servers from a single dashboard and notifies you when a server drifts from its expected configuration so that you can fix it.

Because of such features, Chef is one of the important tools in the DevOps toolset.

Related Links

Related Keywords

DevOps, Puppet, Ansible, Saltstack, Fabric

ART – Android RunTime

In simple terms, ART stands for Android RunTime, i.e. the environment used by the Android operating system to run applications. It is the successor to the earlier virtual machine, Dalvik, which was part of the Android operating system until Android 4.4 (KitKat).

What are the key features of Android Runtime (ART)?

ART introduced “Ahead of Time” (AOT) compilation of applications: when an application is installed on an Android system, its Java bytecode is translated into machine instructions as part of the installation. This improves execution performance tremendously. Earlier, when Dalvik was part of the Android OS, applications were compiled “Just In Time” (JIT) as they ran, which slowed down their execution.

[Image: Android Runtime in the Android software stack. Source: https://developer.android.com/guide/platform/index.html]

You may be wondering why anyone would choose JIT compilation in the first place. Remember that those were the days when mobile phones were far less powerful than today's. They had less storage, so Android was forced to keep application sizes small. Compiled versions of applications tend to be somewhat larger, hence the trade-off.

Another drawback of the Dalvik runtime was battery consumption: it compiled the code every time an application was started.

By introducing AOT, the Android Runtime saves a lot of battery and also improves execution performance. There is a small price to pay, though: application installation takes longer than it did on Dalvik, because the code is compiled at install time. As you would agree, however, this is a price worth paying.

Among the other benefits of ART, improved memory allocation and garbage collection are the important ones. It also provides debugging features and high-level profiling of applications.

ART provides backward compatibility with Dalvik, so an application that runs well on Dalvik should work well on ART too. However, the reverse may not be true: some techniques that work on Dalvik do not work on ART.

Related Links

Related Keywords

Java Runtime Environment (JRE), Dalvik, JIT, AOT, JVM, Bytecode

TensorFlow

As a leading company in Machine Learning, Google had built DistBelief as a proprietary Machine Learning framework. As its usage grew, they decided to build a second generation of the library. This also involved refactoring the code, making it faster and, in general, more robust. The new version of the framework was named TensorFlow, and Google also made it open source.

[Image: TensorFlow is popularly used with the Python API]

Basics of TensorFlow

A tensor is a multi-dimensional array of a base data type. The name TensorFlow is derived from “tensors” and the operations (the “flow”) performed on them. Machine Learning is all about data, and tensors are therefore the basic data structure of the framework. TensorFlow is written in C++ and CUDA (Nvidia's language for programming GPUs), but it provides APIs for Python, C++, and Java, in addition to a few others.
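
For instance, with the Python API (a minimal sketch assuming TensorFlow 2.x):

```python
import tensorflow as tf

# Tensors are n-dimensional arrays with a fixed base dtype.
scalar = tf.constant(3.0)                        # rank 0, shape ()
vector = tf.constant([1.0, 2.0, 3.0])            # rank 1, shape (3,)
matrix = tf.constant([[1.0, 2.0], [3.0, 4.0]])   # rank 2, shape (2, 2)

print(scalar.dtype, vector.shape, matrix.shape)

# The "flow" is the set of operations applied to tensors, e.g. a matrix multiplication.
print(tf.matmul(matrix, tf.reshape(vector[:2], (2, 1))))
```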

TensorFlow allows you to create machine learning models and also to train them. A very famous example use of TensorFlow is image recognition: you can train a model on a labelled dataset using supervised learning and then use that model in your application.
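
As a minimal sketch of that workflow (using the Keras API bundled with TensorFlow 2.x, and a toy numeric dataset instead of images), training and then using a model looks like this:

```python
import numpy as np
import tensorflow as tf

# Toy supervised-learning data: inputs x and labels y = 2x + 1.
xs = np.array([[0.0], [1.0], [2.0], [3.0], [4.0]], dtype=np.float32)
ys = 2 * xs + 1

# A single-neuron model is enough to learn this linear relationship.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(1,))])
model.compile(optimizer="sgd", loss="mse")
model.fit(xs, ys, epochs=200, verbose=0)

# Use the trained model in an application: predict for x = 10 (expect a value close to 21).
print(model.predict(np.array([[10.0]], dtype=np.float32)))
```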

While there are some alternatives available, TensorFlow is gaining popularity among the community interested and working in the Machine Learning field. Its APIs are easy to use and allow students and researchers to iterate and create models quickly. It offers a high level of accuracy and can also run on many processors in parallel, using server farms of GPUs and TPUs (Tensor Processing Units).

Interesting Use cases:

  • Image recognition – identify and name objects/persons in a given image
  • Text-based applications such as translation and language detection
  • Text summarization
  • Voice or sound recognition

One of the interesting use cases I came across is from a company called “Connecterra”. They use wearable sensors to collect data from cows on farms. The collected data is passed through TensorFlow, and a model has been created that detects possible health issues in the cows and gives the farmers advance warning. The company also claims 30% higher yields for its customer farmers.

Related Links

Related Keywords

Machine Learning, Supervised Learning, GPU, CUDA, Python

 

Scala

Scala is a functional programming language that has been gaining popularity over the past few years. There are some key benefits to using functional programming, and this language in particular.

[Image: Scala programmer, representative image 😉]

What is Scala?

As mentioned above, it is a functional programming language that compiles code into Java bytecode. Once compiled, this bytecode can run on any JVM (Java Virtual Machine). Scala can directly use Java libraries as well as its own. It is also statically typed, which simply means that the code, along with all its variables and their types, is checked thoroughly at compile time. This helps in identifying bugs early and reduces the number of issues discovered at runtime.

Scala was first introduced in 2003 as an attempt to create a “better Java”. As a result, it has features of OOP as well as functional programming built in, making it a very powerful language to use. Its compiler can infer quite a lot about variables, functions, and classes, which in turn helps developers write very concise code.

Key Features

  • Type inference
  • Singleton Object
  • Immutability
  • Powerful Pattern Matching for switch/case statements
  • Higher-order functions (functions that take other functions as input)
  • Traits (Improved version of Interfaces)
  • Concurrency

Scala is often considered one of the best languages for concurrency. It is designed with multi-core hardware in mind, so programs can take advantage of parallel execution.

Essentially, Scala can do everything Java can do, and often in a better fashion!!

Related Links

Related Keywords

Functional Programming, Java, Big Data, Hadoop

 

GraphQL

GraphQL is a query language that was developed by Facebook for internal use and later open-sourced in 2015. It has been gaining traction since then, but it still has some way to go before it can be called mainstream.

[Image: GraphQL, a query language to fetch and manipulate data]

What are the main advantages of GraphQL?

GraphQL greatly simplifies fetching and manipulating data, even when the operations are complex. For a “fetch” query, it returns exactly the data the client requested, no more and no less, which makes the communication highly efficient. The important advantage is that the shape of the response is specified by the client rather than decided by the server. Also, the structure of the data is not hardcoded, as it is in REST APIs.

GraphQL makes complex requests much simpler to implement than REST APIs. With REST, for example, additional filters and options are both pushed into query-string parameters, which makes the request hard to read and makes it difficult to tell which parameter is a filter and which is an option. GraphQL, on the other hand, lets you form queries in a nested format, making them readable as well as easy to process.
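
To make the contrast concrete, here is a hypothetical example; the URL, schema, and field names are all made up:

```python
# REST: filters and options are flattened into the query string,
# so it is hard to tell which parameter plays which role.
rest_url = (
    "https://api.example.com/users/42/posts"
    "?status=published&limit=5&fields=title,comments"
)

# GraphQL: the same request expressed as a single nested, readable query.
graphql_query = """
{
  user(id: "42") {
    posts(status: PUBLISHED, limit: 5) {
      title
      comments {
        author
        text
      }
    }
  }
}
"""
```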

GraphQL server implementations are available in several languages, including JavaScript, Python, PHP, and Ruby. A ready-made reference implementation from Facebook, which runs on Node.js, is also available.

Types of GraphQL operations

As can be guessed, it supports two types of operations:

  • Fetch data
  • Manipulate data (Create, Update, Delete)

Typically, you use a GET request to fetch data and a POST request to manipulate it (in practice, many clients send every GraphQL operation over POST). Using GET for fetches allows the responses to be cached at various levels, which adds to the efficiency of the overall architecture.
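
As a sketch, assuming a hypothetical endpoint at https://api.example.com/graphql, sending a query over POST from Python could look like this:

```python
import json
import urllib.request

# Hypothetical GraphQL endpoint; most servers expose a single /graphql URL.
url = "https://api.example.com/graphql"

# The query itself is just a string; the server resolves exactly these fields.
query = """
{
  user(id: "42") {
    name
    email
  }
}
"""

request = urllib.request.Request(
    url,
    data=json.dumps({"query": query}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(request) as response:
    print(json.load(response))  # e.g. {"data": {"user": {"name": ..., "email": ...}}}
```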

Both REST and GraphQL support evolution of APIs, i.e. adding new fields and deprecating unused ones, without breaking existing endpoints.

There is plenty of material comparing REST and GraphQL. But essentially, both have their own advantages, and in a microservices architecture, they can live together happily!!

Related Links

Related Keywords

Microservices Architecture, REST, Node.js