Data - Dashboards - Decision Services: Generalizing the cricket game platform.

Complex things are explained through the familiar.

Explaining the Cricket platform

The all-too-familiar cricket platform can be broken down as below.


Data

The cricket platform is well known to all. There are various websites like cricinfo.com, cricsheet.org, Star Sports, ESPN, ICC, etc. These platforms have been collecting data about players, teams and matches for many years, and the resulting statistics are displayed on screen for every match.

Dashboards

Imagine a cricket match on TV without the dashboards!

The dashboards display statistics about the teams and the players, along with real-time match scores and updates. A name for this is role-based dashboards.

Decision Services 

Thus, statistics plays an important role in cricket too.

The icing on the cake is the Duckworth-Lewis-Stern (DLS) method: a data-driven match decider that can settle a result even when less than half the match has been played. The ball-tracking system, too, decides LBW appeals using physics-based models.
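To make the idea concrete, here is a minimal sketch of the resource-based reasoning behind DLS, in the simplified form used when the chasing side has fewer resources: the par score is the first-innings total scaled by the ratio of resources available to each side. The resource percentages below are hypothetical placeholders, not values from the published DLS tables.

```python
def dls_par_score(team1_score: int, team1_resources: float, team2_resources: float) -> int:
    """Par score for the chasing side: first-innings score scaled by the resource ratio."""
    return int(team1_score * (team2_resources / team1_resources))

# Hypothetical example: team 1 scored 250 using 100% of its resources;
# rain leaves team 2 with only 70% of its resources.
print(dls_par_score(250, 100.0, 70.0))  # -> 175
```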

Problems of real-world J2EE

Using J2EE with EJBs, it is a complex task to build such an all-too-familiar platform. Many software companies have built dedicated platforms for mining, logistics, travel, etc. in crude ways, reinventing the wheel of gathering resources, project design and execution each time.

A scalable platform using Data and Algorithms

When everyone is building almost the same type of project, why not provide readily available resources that can be reused? Hence a ready-to-use platform was conceived, free of the usual performance constraints, viz. algorithm complexity, compute power, storage and networking. Libraries implementing familiar analytics that can be a quick fix wherever necessary. Data from multiple sources that can be accessed seamlessly without bottlenecks. Real-time dashboards linking visualizations with data and decision-making algorithms. A further advantage of such a platform is that even less skilled and less experienced employees can build great projects in a short amount of time.

More explanation on this can be found at this link:

How I transformed the J2EE architecture into a Data and Algorithms/Services architecture.

Thus was born The Stratagem Innovation Platform©

Bringing NASA technologies down to earth!

Now, everybody's ambition is to work at NASA and explore the universe. But NASA doesn't have a job for everyone. So how about working on the same technologies that NASA does, albeit more down to earth, and contributing to the society we live in? Well, that takes some understanding of what NASA actually does.

Let's explore that...





The big picture is that NASA fires rockets and launches satellites and probes all over the solar system. But, really, these satellites and probes carry imaging devices, sensors, transponders, spectrometers, etc., which collect data as part of various scientific experiments and transmit it down to earth. The data is then analyzed and visualized, and decisions are made at NASA facilities by various supercomputers. That makes up the NASA platform. A similar approach is taken by other space organizations, viz. Roscosmos, ESA, JAXA, ISRO, CNSA, etc., and they have all been collaborating for many years through data sharing and shared analytical methodologies.

Us Earthlings ...

Well, it's the year 2001: the ISS is hovering overhead, NASA sends a new orbiter to Mars, and we see fresh pictures of Mars after a months-long wait.


Meanwhile, on earth, the software industry has crashed amid the dot-com doom, 9/11 happens, and people have lost their jobs and livelihoods. Awww ... awkward.

Stratagem Software Services is born.

The Strategy.
With a mission to build sustainable technologies that can engage the masses.
To transform the existing J2EE architecture into a real-time platform for data collection and analysis. Built for websites and banking transactions, J2EE is severely incapable of supporting any scientific application development. By following the NASA model, the new architecture for Data and Algorithms was conceived. A detailed explanation can be found in another blog.

Explaining with a simple example

A mini version of satellites and probes: newly developed pedometers, thermometers, BP monitors, etc. were displaying activity as standalone digital instruments. If their data was captured for analysis, wonderful insights could be drawn; statistical analysis of individual and population data would reveal patterns no single device could show. And to analyse and make decisions from such data, supercomputers were needed, and various reusable algorithms could be developed, called Services.
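A minimal sketch of that idea, using hypothetical pedometer readings: capture the standalone-device data and draw simple individual and population-level statistics from it.

```python
from statistics import mean, stdev

# Hypothetical daily step counts captured from standalone pedometers
daily_steps = {
    "person_a": [4200, 5100, 3900, 8000, 7600],
    "person_b": [10200, 9800, 11000, 9500, 12100],
}

# Individual-level insight: average activity and its spread
for person, steps in daily_steps.items():
    print(person, "average:", mean(steps), "spread:", round(stdev(steps), 1))

# Population-level insight: average across everyone whose data was captured
all_steps = [s for steps in daily_steps.values() for s in steps]
print("population average:", mean(all_steps))
```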

Thus were born IoT devices and the cloud platform here on earth.

Nebula project

NASA was among the first to implement this collaborative architecture when it built Nebula, which was open-sourced and went on to form the basis of OpenStack. Over time many new tools were developed.

DataScience: A future for mathematics and sciences.

At school, college or university, mathematics has been taught as a subject to sharpen the intellect: solving the trickiest problems and modelling complex assumptions, all with paper and pen. But when it comes to the real world, graduates of mathematics are shunned and confined to the teaching profession. There are multiple reasons:

  1.  lack of training on tools and software
  2.  unfamiliarity with frequently used engineering methods
  3.  ignorance of real-world usage and applications
  4.  others not knowing how to utilize their skills
To solve all the above issues, a strategy was devised to make mathematics more practical.

DataScience: Mathematics in practice

Whether it's Ramanujan's number theory, Euclidean geometry or the Pythagorean theorem, all deal with data and with formulas to manipulate that data. In the modern age of computers it becomes possible to implement those formulas as algorithms. Hence there's no reason maths graduates should be confined to pen and paper; rather, they can put their skills to practical use, harness the big machines and solve the most complex of problems without much stress or strain. What remains is to gain some familiarity with where mathematics is being used and how to improve upon that!
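As a toy illustration of turning a formula into an algorithm over data, here is the Pythagorean theorem applied not to a single triangle on paper but to a whole (hypothetical) dataset of right-triangle legs.

```python
from math import hypot

triangles = [(3, 4), (5, 12), (8, 15)]  # hypothetical legs (a, b) of right triangles

# The formula c = sqrt(a^2 + b^2), applied to every record in the dataset
hypotenuses = [hypot(a, b) for a, b in triangles]
print(hypotenuses)  # [5.0, 13.0, 17.0]
```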

Usage of Mathematics

Whether it's physics, chemistry or biology; psychology, sociology or criminology; astronomy, meteorology or oceanography; genetics, diagnostics or forensics, or any of the other subjects taught at school or university, mathematics has been an integral part of them all, knowingly or unknowingly, at various levels of complexity. Overlooking the multidisciplinary aspect of these subjects was the sin. Bridging this gap and building a multidisciplinary platform for mathematics becomes the fundamental reason for DataScience, with statistics, probability, optimization and logic playing an important role.

DataScience for scientific discovery

Although experimentation is an important aspect of scientific discovery, the analysis performed on the experimental data forms the crux of research papers. The modelling, analysis, reporting, decisions and conclusions drawn from the data decide the quality of the research. And to perform such unquestionable research there needs to be continuous innovation in models and methods. Mixing and matching multiple data sources can enhance the output, and more processing power provides the speed to put research on the fast track. Thus, DataScience plays a pivotal role in modern research.

DataScience in automation

Robotics, cyber-physical systems and electromechanical systems have been in existence for over 50 years. The fundamental drawback of these systems has been their limited memory and processing power, which limits their functionality, with applications hard-coded into that limited memory over an RTOS. Reimagining their functionality as data-driven systems responding to stimuli makes their development far more streamlined. With the ability to embed more complex application logic, their behaviour becomes extremely flexible, doing away with human intervention. That's real automation.

Event-driven systems, Machine Learning and Artificial Intelligence too have existed for a long time with limited success, partly due to limited resource availability, affordability and minimal practical use. But when we reimagine their functionality as data-driven, their use becomes extremely flexible thanks to the algorithm-run, software-defined approach.
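A minimal sketch of what "data-driven" means here, with entirely hypothetical names and readings: instead of hard-coding the response into firmware, the controller derives its decision threshold from recorded data and applies it to each incoming stimulus.

```python
from statistics import mean, stdev

# Hypothetical historical temperature readings used to learn a threshold
historical_temps = [20.1, 20.4, 19.8, 20.6, 21.0, 20.3]
threshold = mean(historical_temps) + 2 * stdev(historical_temps)  # derived from data, not hard-coded

def on_temperature_event(reading: float) -> str:
    """Respond to a stimulus using the data-derived threshold."""
    return "start cooling" if reading > threshold else "idle"

for event in [20.2, 22.5, 19.9]:  # incoming stimuli
    print(event, "->", on_temperature_event(event))
```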

A lot has changed with this change in architecture. By applying the data-driven approach, wonderful new technologies have developed: drones in the sky and automated cars on the roads, precision robots and optimized factories, mathematical biology and analytical chemistry. All of them utilize their domain-specific data and unprecedented computing power in ways that were impossible earlier.



How I transformed the J2EE architecture into a Data and Algorithms/Services architecture.

One of the biggest drawbacks of the J2EE platform was its exclusive focus on secure transactions. Although great for banking and financial services, the inability to utilize the computing power behind it was a great handicap. After all, computers were made for computing! And you need mathematics to do it. Ignorance breeds contempt when the people at the helm are MBAs, BE/BTechs and MCAs. Easy-money J2EE was the culprit.

When scientific applications are developed, they rely on computing power, the data presented to them, and the logic and algorithms to analyze it. J2EE was a non-starter for developing such applications. Hence a new platform was needed.

All the previous attempts failed because of their inaccessibility. Supercomputers were big boxes stashed away in one corner of the world, running exclusive applications for a select few. The J2EE-driven dot-com doom was left for the remaining people of the world.

With datacenters being built across the world, and advertising being the mode of financial return, it was the need of the hour to utilize the power of the datacenter with a new business plan catering to the scientific needs of the rest of the world.

Thus was developed the data-driven, services-based, on-demand platform by Stratagem Software Services: an architecture with seamless access to data from multiple datasources, abundant computing resources, convenient access to algorithms, and a culture of scientific analysis and presentation over a pay-per-use model, with affordability, scalability and flexibility as top priorities. An architecture that utilized regular datacenter capacity in a more capable way.

Bringing a paradigm transformation to scientific culture. Changing the norm from experimentation to data-analytics-driven scientific discovery. A reusable set of algorithms/applications that could be applied across multiple scientific domains. Making it convenient for the scientific community to focus on their work rather than on the workings of the computer's operating system. Saving an enormous amount of time by utilizing the abundant computing power. Paving the way for rapid research whose results, otherwise, were years away.

Aptly named 

The Stratagem Innovation Life Cycle (SILC)

and 

The Stratagem Innovation Platform©.

The concept and purpose

China, as a country with an exploding population, did not complain but rose to the challenge. To cater to the rising numbers they started building suitable infrastructure: wide roads, skyscrapers, huge swimming pools, large railway networks, and wide doors and windows to accommodate everyone.

In contrast, in a country like India, whose population is increasing at a rate higher than China's, we haven't comprehended the reality. We try to cut down the explosion by raising prices, lengthening the queues and pushing more people into cattle class, or we make good an escape to America, so that the privileges remain with a select few. It's high time India built its rails, roads, temples and monuments to accommodate its population. India needs extra-broad-gauge rails and wider doors for its temples! Tradition is not a solution.

In the same vein, the internet and its data centers were built in the early days for a small number of users. With penetration increasing by leaps and bounds, the net infrastructure too needs to be expanded to accommodate the billions. Thus the internet architecture deserves a rethink.

As we all realized, sticking together 150 companies by spending $25 billion doesn't make a data architecture; rather, it's the same old wine in a blue bottle! The applications that were built for the old world order cannot solve modern problems. We do need something better than SAP and Oracle.

Hence, to build such a solution catering to the billions, a new concept was thought up. By democratizing the data about the population and then building the necessary infrastructure to understand that data, a genuine problem is solved. Adding InfiniBand networks, more processing power and abundant storage to the data centers has brought supercomputers into the hands of the common man. And building applications for the new hardware has drastically reduced the cost of the software and extended its reach. Thus one huge population challenge has been tackled.

Now what remains for the ISPs in India is to build the necessary networks and maintain net neutrality!

What could be a service ...

Frankly, slapping an IP address onto an existing application, pushing it behind a paywall, selling a hundred such services as web services, and advertising it all extensively as a newly minted architecture called SOA, as companies like IBM did, can carry away gullible people. But such focus-less, short-term architectures damage the real intent behind a concept and mislead people, wasting a lot of time and money.

The real purpose of a service is far more involved and deep. Essentially, services were needed to remove the limitations surrounding application development. A few of the previous limitations overcome by services are as follows.

1. By removing the limits on the processing power available to an application, the scope of a program is greatly widened. Bigger, more intelligent and automated solutions become possible.

2. An application available as a pay-per-use service instead of an installable app is more affordable, and its reach is global.

3. One of the most important aspects not to be missed is that a service is essentially manipulating data. Such an understanding makes the service more global: your application remains the same, only the input data changes (see the sketch after this list).

4. Also, on removing the limitations on the amount of data, one can build extensive, long-lasting applications that are also intelligent.
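A minimal sketch of point 3, with hypothetical data and names: the service logic stays the same while only the input data changes, which is what makes the same service usable anywhere.

```python
def spending_summary(transactions: list[dict]) -> dict:
    """A reusable 'service': summarize spending, wherever the data comes from."""
    total = sum(t["amount"] for t in transactions)
    return {"count": len(transactions), "total": total, "average": total / len(transactions)}

# Same application, different input data
indian_data = [{"amount": 120.0}, {"amount": 75.5}, {"amount": 310.0}]
american_data = [{"amount": 42.0}, {"amount": 18.5}]

print(spending_summary(indian_data))
print(spending_summary(american_data))
```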

That's just the tip of the iceberg!


The data strategy

Indians are great at solving American problems! Not that India is problem-free, but the money lies in solving problems for others. Since liberalization, Indian engineers have been writing loads of code for first-world countries and thus exploiting those markets. That they are capable is not the question; that they aren't using their skills to build India is the point of contention! Well, let's change this trend.

By creating a roaring market for software applications in India and the third world, we can achieve a certain humanitarian goal: the entire world can utilize automated solutions, simplify laborious tasks, and make bundles of money!

To achieve that goal, a quite tricky strategy was devised.

1. Convert the software into reusable services that are paid for per use, so that the complexity and quality of the software are not compromised.

2. Let the services not be mere pieces of software but manipulators of data. As the data changes, be it American, Indian or Nigerian, the quality of the resulting solution remains top class, all at a nominal cost.

That makes the software coded by Indians more global, and Indians too can use the software they built.