
What is Analytics?

Data analytics (DA) is the science of examining raw data with the purpose of drawing conclusions about that information. Q’Zure uses data analytics to help organizations and end users make better business and patient-centered care decisions, with healthcare-specific tools for hospitals and boards; the same discipline is used in the sciences to verify or disprove existing models or theories. Data analytics is distinguished from data mining by the scope, purpose, and focus of the analysis. Data miners sort through huge data sets using sophisticated software to identify undiscovered patterns and establish hidden relationships, while data analytics focuses on inference: the process of deriving a conclusion based solely on what is already known to the researcher.

The science is generally divided into exploratory data analysis (EDA), where new features in the data are discovered, and confirmatory data analysis (CDA), where existing hypotheses are proven true or false. Qualitative data analysis (QDA) is used in the social sciences to draw conclusions from non-numerical data such as words, photographs, or video. In information technology, the term has a special meaning in the context of IT audits, when the controls for an organization’s information systems, operations, and processes are examined. Data analysis is used to determine whether the systems in place effectively protect data, operate efficiently, and succeed in accomplishing an organization’s overall goals.
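The EDA/CDA split can be made concrete with a short sketch. The data below is entirely hypothetical (toy length-of-stay and readmission figures, not real patient data): the exploratory step surfaces a pattern we did not assume, and the confirmatory step then tests a hypothesis that was stated in advance.

```python
import statistics

# Hypothetical toy data: hospital length of stay (days) and 30-day readmissions.
los_days = [3, 5, 4, 8, 6, 9, 7, 10]
readmits = [0, 1, 0, 2, 1, 2, 1, 3]

# --- Exploratory data analysis: look for structure we didn't assume ---
def pearson(xs, ys):
    """Pearson correlation, computed by hand to stay dependency-free."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(los_days, readmits)  # a strong correlation here suggests a hypothesis

# --- Confirmatory data analysis: test a pre-stated hypothesis ---
# H0: the mean length of stay is 5 days; a simple one-sample z-style statistic.
mean = statistics.mean(los_days)
stdev = statistics.stdev(los_days)
z = (mean - 5) / (stdev / len(los_days) ** 0.5)
```

On this toy data the correlation comes out strongly positive, which is exactly the kind of EDA finding that would then be handed to a confirmatory test on fresh data.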

The term “analytics” has been used by many business intelligence (BI) software vendors as a buzzword to describe quite different functions, covering everything from online analytical processing (OLAP) to the detection of patterns and trends. Healthcare examples include discovering the risk factors and antiemetics associated with postoperative nausea and vomiting (PONV), identifying system trends that prevent catheter-associated urinary tract infections (CAUTIs), and supporting physician- and specialty-specific quality-of-care measures, documentation for ACGME practice evaluation, root cause analysis (RCA), and human factors analysis (HFACS), techniques also used in many other industries. Modern data analytics often relies on information dashboards supported by real-time data streams. So-called real-time analytics involves dynamic analysis and reporting based on data entered into a system less than one minute before the actual time of use.



Analytics Word Cloud

Natural language processing, often abbreviated as NLP, refers to the ability of a computer to understand human speech as it is spoken. NLP is a key component of artificial intelligence (AI) and relies on machine learning, a specific type of AI that analyzes and makes use of patterns in data to improve a program’s understanding of speech. Q’Zure uses multiple tools to analyze both structured and unstructured data. Our team is led by Dr. Sherif Elfayoumy, Director of Computer Science at the University of North Florida, whose extensive background in this field stems from his early work as an innovator and pioneer of NLP and AI. Dr. Elfayoumy maintains his professorship at UNF, speaks at various universities, and continues to develop new tools, holding multiple patents in and around AI and database architecture using a wide array of IT solution software.
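A first step in turning unstructured text into structured data is simply tokenizing it and spotting terms of interest. The sketch below is a deliberately minimal illustration, not any production pipeline: the clinical note and the symptom list are hypothetical, and real NLP systems go far beyond keyword counting.

```python
import re
from collections import Counter

# Hypothetical free-text clinical note (unstructured data).
note = "Pt reports nausea and vomiting post-op. No fever. Nausea improved with ondansetron."

# Tokenize and count terms -- the simplest 'structured view' of free text.
tokens = re.findall(r"[a-z]+", note.lower())
counts = Counter(tokens)

# A toy keyword spotter over a hypothetical symptom vocabulary.
symptoms = {"nausea", "vomiting", "fever"}
found = {term: counts[term] for term in symptoms if counts[term] > 0}
```

Note that this naive approach counts “fever” even though the note says “No fever”; handling negation and context is precisely where machine-learning-based NLP earns its keep over pattern matching.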


Machine Learning is the only kind of AI there is.

AI is changing. We are now recognizing that most things called “AI” in the past are nothing more than advanced programming tricks. As long as the programmer is the one supplying all the intelligence to the system by programming it in as a World Model, the system is not really an Artificial Intelligence. It’s “just a program”.

Artificial Intelligence

Don’t model the World; Model the Mind.

When you Model the Mind you can create systems capable of Learning everything about the world. It is a much smaller task, since the world is very large and changes behind your back, which means World Models become obsolete the moment they are made. The only hope for creating intelligent systems is to have the system itself create and maintain its own World Models, continuously, in response to sensory input.

Following this line of reasoning, Machine Learning is NOT a subset of AI. It really is the ONLY kind of AI there is.
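The contrast between programming the rule in and letting the system learn it can be shown with the smallest possible example: a single perceptron. Nothing below encodes the logical OR function directly; the weights start at zero and the update rule recovers it from examples alone. (The perceptron is a textbook algorithm used here for illustration, not a technique claimed by the text.)

```python
# Training examples for logical OR -- the only "knowledge" we supply.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

w = [0.0, 0.0]   # weights, deliberately starting with no built-in rule
b = 0.0          # bias
lr = 0.1         # learning rate

def predict(x):
    """Fire (1) if the weighted sum crosses the threshold, else 0."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Classic perceptron update: nudge weights toward each mistake's correction.
for _ in range(20):
    for x, target in data:
        err = target - predict(x)
        w[0] += lr * err * x[0]
        w[1] += lr * err * x[1]
        b += lr * err
```

After a handful of passes the neuron classifies all four cases correctly, even though no one ever wrote the OR rule into the code; that, in miniature, is the learning-versus-programming distinction the paragraph above draws.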

And this is now proving to be true, and in a big way. Since 2012, a specific Machine Learning technique called Deep Learning has been taking the AI world by storm. Researchers are abandoning the classical “Programming Tricks” style of AI in droves and switching to Deep Learning… based mainly on the fact that it actually works. We’ve made more progress in the three years since 2012 than we did in the preceding 25 years on several key AI problems, including Image Understanding (a really hard one), Signal Processing, Voice Understanding, and Text Understanding.

Another clue that we are now on the right track: Old style AI projects like CYC ran to millions of propositions or millions of lines of code. Systems that (successfully) Model the Mind can be as small as 600 lines of code; several recent Deep Learning projects clock in somewhere in that range. And these programs can move from one problem domain to another with very few changes to the core; this means these methods are GENERAL intelligences, not specific to any one problem domain. This is why it is called Artificial General Intelligence. And we’ve never had any AI programs that could do this in the past. As an example, the language understanding programs we are creating using DL will work equally well in any language, not just English. It just takes a re-training to switch to Japanese… another indication that Deep Learning is closer to true intelligence than traditional NLP systems.