Tools of the Trade
This is a note-to-self post. I am listing here some of the tools that I use on a daily basis. Some of these tools are just awesome, like Powerline.
Working with Hadoop involves working with huge amounts of data. It also, at times, involves moving huge amounts of data from traditional data stores such as MySQL and Oracle. Apache…
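To make that kind of movement concrete, here is a rough sketch of what bulk extraction looks like without a dedicated transfer tool: pull rows out of MySQL in batches and stage them as delimited files that can then be loaded into HDFS. The host, credentials, and table names are placeholders for illustration only.

```python
# Minimal sketch: batched export of a MySQL table to a tab-separated file.
# Dedicated transfer tools automate (and parallelize) exactly this step.
import csv
import mysql.connector  # pip install mysql-connector-python

conn = mysql.connector.connect(
    host="mysql.example.com",   # placeholder host
    user="etl",                 # placeholder credentials
    password="secret",
    database="sales",
)
cursor = conn.cursor()
cursor.execute("SELECT id, customer_id, amount, created_at FROM orders")

with open("orders.tsv", "w", newline="") as out:
    writer = csv.writer(out, delimiter="\t")
    while True:
        rows = cursor.fetchmany(10_000)  # stream in batches to bound memory
        if not rows:
            break
        writer.writerows(rows)

cursor.close()
conn.close()
```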
Hadoop has become the de-facto standard for big data and batch processing. Think of a data pipeline and you end up with Hadoop. The Hadoop ecosystem is changing, and changing at a…
Hacker News is an awesome place. It keeps people involved and interested. There are articles, questions, answers, and debates. One such debate going on is about Dependency Injection. It was declared…
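As a quick refresher on what is actually being debated, here is a minimal, framework-free sketch of dependency injection; the class names are made up purely for illustration.

```python
# Dependency injection in its simplest form: the collaborator is passed in
# from outside instead of being constructed inside the class, so it can be
# swapped for a test double without touching the class itself.

class InMemoryStore:
    """Stand-in dependency; a real one might wrap a database."""
    def __init__(self, data):
        self.data = data

    def fetch(self, key):
        return self.data[key]


class ReportService:
    def __init__(self, store):
        # The store is injected rather than hard-wired here,
        # which keeps this class easy to test and to reconfigure.
        self.store = store

    def build(self, report_id):
        return f"report: {self.store.fetch(report_id)}"


service = ReportService(InMemoryStore({"q1": "Q1 numbers"}))
print(service.build("q1"))  # report: Q1 numbers
```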
There are several tools/frameworks available that help process data as it arrives. I had done a comparative study of the four systems below in the past: Apache Kafka, Facebook Scribe, Cloudera Flume, Apache…
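To give a flavour of the "process data as it arrives" model, here is a minimal consumer sketch using the kafka-python client (an assumption for illustration; the broker address and topic name are placeholders).

```python
# Minimal sketch: consume records from a Kafka topic as they arrive.
# Requires a running broker and: pip install kafka-python
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",                              # placeholder topic name
    bootstrap_servers=["localhost:9092"],  # placeholder broker address
    auto_offset_reset="earliest",          # start from the oldest record
    value_deserializer=lambda raw: raw.decode("utf-8"),
)

# Blocks and yields records as producers publish them.
for record in consumer:
    print(record.partition, record.offset, record.value)
```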