Wednesday, July 22, 2009

Timeline of Statistical Analysis

Timeline of Statistical Analysis - A journey into the history of statistical analysis
17th Century
1654 - Blaise Pascal and Pierre de Fermat lay the foundations of the theory of probability in their correspondence on games of chance.
1693 - Edmond Halley publishes one of the first mortality tables, statistically relating death rate to age.

18th Century
1724 - Abraham de Moivre further explores mortality statistics and develops the theory of annuities.
1733 - Abraham de Moivre is the first to introduce the normal distribution as an approximation to the binomial distribution in probability.
1761 - Thomas Bayes formulates Bayes' theorem (published posthumously in 1763).

19th Century
1805 - Adrien-Marie Legendre is the first to publish the method of least squares for fitting a curve to a given set of observations.
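Legendre's idea survives essentially unchanged in modern numerical libraries. As a minimal sketch (with made-up observations), a straight line can be fitted to data by solving the least-squares problem with NumPy:

```python
# Least-squares line fit (illustrative sketch; the data are invented).
import numpy as np

# Hypothetical observations: y is roughly 2*x + 1 with small disturbances.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Design matrix with a column of ones for the intercept term.
A = np.column_stack([x, np.ones_like(x)])

# Solve min ||A @ beta - y||^2 for beta = (slope, intercept).
beta, _, _, _ = np.linalg.lstsq(A, y, rcond=None)
slope, intercept = beta
print(f"fitted line: y = {slope:.2f} * x + {intercept:.2f}")
```

The same machinery generalizes to any model that is linear in its parameters by adding columns to the design matrix.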

20th Century
1933 - Andrey Kolmogorov publishes Foundations of the Theory of Probability, an axiomatization of probability based on measure theory.
1943 - Kenneth Levenberg develops a method for nonlinear least-squares fitting.
1953 - Nicholas Metropolis and colleagues introduce the Metropolis algorithm, a Monte Carlo sampling method that later underpins simulated annealing.
1962 - Donald Marquardt publishes the Levenberg-Marquardt nonlinear least-squares fitting algorithm, extending Levenberg's method.
1980 - Computers are introduced into statistical analysis, which later leads to the development of statistical software.
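The Levenberg-Marquardt method from the timeline above remains a standard tool for nonlinear least-squares fitting; SciPy exposes MINPACK's implementation. A minimal sketch, fitting a hypothetical (noiseless, synthetic) exponential-decay dataset:

```python
# Levenberg-Marquardt fit of y = a * exp(-b * x) (illustrative sketch;
# the data are synthetic, generated with a = 2.5 and b = 1.3).
import numpy as np
from scipy.optimize import least_squares

x = np.linspace(0, 4, 20)
y = 2.5 * np.exp(-1.3 * x)

def residuals(params):
    a, b = params
    return a * np.exp(-b * x) - y

# method='lm' selects the Levenberg-Marquardt algorithm.
fit = least_squares(residuals, x0=[1.0, 1.0], method='lm')
a_hat, b_hat = fit.x
print(f"a = {a_hat:.3f}, b = {b_hat:.3f}")
```

Starting from the rough guess (1.0, 1.0), the algorithm converges to the parameters used to generate the data.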

Tuesday, May 12, 2009

MaxisIT eNalyze for Statistical Computing

MaxisIT’s eNalyze™ software solution, part of the CT Renaissance™ enterprise suite, offers a multitude of functionalities, making it a one-stop solution for both your software and services needs. MaxisIT has extensive breadth and depth in clinical data analysis and report development. Our analysis group evaluates trials in the design phase using web-based, wizard-driven interfaces and automated batch and/or real-time production of:

Pre-clinical safety reports
Interim analysis and reporting
Clinical data viewing and ad-hoc reporting
Clinical efficacy and safety analysis and reports
Patient profiles
Non-clinical reports
Progress monitoring reports


Easy and intuitive web-based software for faster, more efficient SDTM validation & Define.xml generation
Web-based, wizard-driven, point-and-click report development
Built-in repository of statistical methods, standard report templates, and shells
End-to-end traceability of data & referential linking
Comprehensive metadata management
Enables efficient decision-making through quick implementation of emerging data standards
Low total cost of ownership (TCO)
Interoperable and can be fully integrated with other systems
Round the clock support
GCP-ICH & 21 CFR Part 11 compliant processes and software infrastructure


Monday, May 11, 2009

Statistical Computing Environment

Statistics is the branch of science that deals with the many numbers generated in a controlled environment by a given process; their later analysis helps us understand the significance of the underlying activity. Statistical analysis is a tedious process that demands the full concentration of statisticians, since a small error in computation can compromise an entire study. Electronic methods of computing statistics that are accurate, reliable, and simple together constitute statistical computing, which has a wide range of applications in the clinical trials industry.

Statistical computing in the clinical trials industry is carried out in a regulated and controlled setting referred to as a Statistical Computing Environment. In this environment, a Statistical Analysis Plan is documented first. Other key activities of statistical computing include analysis of data files, programming of statistical results, generation of compiled reports, validation in line with regulatory authority guidelines and compliance requirements, publication of the relevant results, and enabling data viewing on demand.
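To make "programming of statistical results" concrete, here is a toy sketch of the kind of computation a Statistical Analysis Plan might prescribe: a two-sample t-test comparing an endpoint between two treatment arms. The data, arm names, and endpoint are entirely invented for illustration:

```python
# Hypothetical comparison of an endpoint between two trial arms
# (illustrative sketch only; all values are made up).
from scipy import stats

treatment = [5.1, 4.8, 6.2, 5.9, 5.4, 6.0, 5.7]
placebo = [4.2, 4.5, 4.1, 4.9, 4.4, 4.6, 4.3]

# Two-sample t-test on the arm means.
t_stat, p_value = stats.ttest_ind(treatment, placebo)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```

In a real Statistical Computing Environment, such a computation would be version-controlled, validated, and fully traceable back to the source data.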

A Statistical Computing Environment needs to address a few predefined topics and concepts to ensure the smooth operation of the entire process. The concepts that must be in place for a stable Statistical Computing Environment are metadata management, standard data transformations, a capable reporting system, a professional and proficient documentation system, and publication of the necessary data in compliance with audit trail and data security requirements.

The Statistical Computing Environment is a large concept still in the making, continuously changing with the aim of improving the quality and efficacy of drugs in development and bringing a renaissance to the current clinical trials process.