Abhiruchi Khandelwal

Data Scientist from India

About

Driven by curiosity, persistence, and the wonderful community around me, I never gave up. Coming from a non-mathematics background, it was not easy to work through the algorithms, but I kept trying, and today I stand with a bag full of knowledge, still learning new things and enjoying them.
I have given five years to coding.

Knowledge

Probability and Statistics

1. Difference between Statistical and Probabilistic Domain
2. Difference between Sample and Population
3. Deterministic and Stochastic variable
4. Random Variable: Qualitative and Quantitative
5. Statistical Domain
    5.1 Understanding the beauty of Statistics
    5.2 Different Statistics of sample data of Quantitative Random Variable
    5.3 Frequency and Relative Frequency
    5.4 Frequency and Relative Frequency Distribution of Random Variables
    5.5 First, Second, and Third Quartile of a sample of data
    5.6 Percentile
    5.7 Coded all of the above in Python for better understanding #links
6. Probabilistic Domain
    6.1 Mathematical meaning of Probability (Limiting case of relative frequency)
    6.2 Conditional Probability
    6.3 Events (dependent, independent, mutually exclusive)
    6.4 Types of Quantitative Random Variables: Continuous and Discrete
    6.5 Univariate Probability Distributions
    6.6 Individual Univariate PDF
    6.7 Some Continuous Random Variable Probability Distributions: Normal, Standard Normal, Rayleigh
    6.8 Cumulative Probability and the Distribution Functions
    6.9 Z-score in Standard Normal Probability Distribution
    6.10 Multivariate Probability Distributions
    6.11 Different Multivariate Joint Probability Distributions
    6.12 Joint Multivariate Normal Probability Distribution
    6.13 Joint Multivariate Normal Probability Distribution Function
    6.14 Coded all of the above in Python for better understanding #links
7. Frequentist Inferential Statistics
    7.1 Sampling Distribution and CLT
    7.2 Python code for Sampling Distribution and CLT #links (see the sketch after this list)
    7.3 First, Second, Third and Fourth order moments, Skewness and Kurtosis of Distributions.
    7.4 Likelihood Functions
    7.5 Point Estimation Of Population Parameters.
    7.6 Confidence Interval Estimation of Population Parameters
    7.7 Large Sample Hypothesis Testing for means of one and two populations
    7.8 Student-t Distribution
    7.9 Small Sample Hypothesis Testing for means of one and two populations
    7.10 Chi-Square Distribution
    7.11 Small Sample Hypothesis Testing for variance of one population
    7.12 F Distribution
    7.13 Small Sample Hypothesis Testing for variance of two populations
    7.14 ANOVA: Small Sample Hypothesis Testing for means of multiple populations
    7.15 Code for the above tests #links
    7.16 Pearson Correlation Analysis
    7.17 Pearson's Chi-Square statistic for analysis of categorical data
    7.18 Non-Parametric Inferential Statistics
      7.18.1 Frequentist Inferential Statistics for Qualitative Data
      7.18.2 Converting Qualitative data into ranks
      7.18.3 Wilcoxon Rank Sum Test
      7.18.4 Wilcoxon Signed-Rank Test for a paired experiment
      7.18.5 Kruskal-Wallis H-test for a Completely Randomized Design
      7.18.6 Friedman Rank Test for a Randomized Block Design
      7.18.7 Rank Correlation Coefficient
      7.18.8 Code for the above #links
8. Bayesian Inferential Statistics
    8.1 Bayes Theorem
    8.2 Prior and Posterior Probability
    8.3 Different interpretations of Bayes Theorem
    8.4 Application of Bayes Theorem
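
As a taste of items 7.1 and 7.2, here is a minimal Python sketch (NumPy only, with an illustrative exponential population chosen just for the example) that simulates the sampling distribution of the sample mean and checks it against what the Central Limit Theorem predicts:

```python
import numpy as np

rng = np.random.default_rng(0)

# Skewed, clearly non-normal population: exponential with mean 2 and std 2.
population_mean, population_std = 2.0, 2.0
n_samples, sample_size = 10_000, 50

# Draw many independent samples and keep each sample's mean.
sample_means = rng.exponential(scale=2.0, size=(n_samples, sample_size)).mean(axis=1)

# CLT: the sampling distribution of the mean is approximately
# Normal(population_mean, population_std / sqrt(sample_size)).
print("mean of sample means:", sample_means.mean())        # close to 2.0
print("std of sample means :", sample_means.std(ddof=1))   # close to 2 / sqrt(50) ≈ 0.283
```

Even though the population is skewed, the distribution of the sample means comes out approximately normal, which is exactly what the hypothesis tests in item 7 rely on.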

Linear Algebra

1. Matrix
2. Row and Column Interpretation of a Matrix
3. Rectangular and Square Matrices
4. Singular Matrix
5. Matrix as a set of vectors
6. Transpose and Inverse of a Matrix
7. Conditions on invertibility of a Matrix
8. Gauss-Jordan Elimination Method for solving a system of linear equations in matrix form
9. Rank of a Matrix
10. Full rank Matrices
11. Vector Spaces and Subspaces
12. Null Space, Left Null Space, Column Space, and Row Space of a Matrix
13. Linearly Dependent and Independent Vectors
14. Row or Column Space spanned by Basis Vectors
15. Orthogonal Vectors
16. Orthogonal Matrices
17. Orthonormal Matrices
18. Approximate (least-squares) solution to a system of Linear Equations when no exact solution exists
19. Projection of a vector onto another vector
20. Projection Matrices
21. Gram-Schmidt Orthogonalization for obtaining Orthonormal Basis Matrices
22. Determinant and Trace of a Matrix
23. Eigenvectors and Eigenvalues
24. Symmetric Matrix
25. Matrix Factorization
26. Singular Value Decomposition of Matrices
27. Positive Semidefinite and Definite Matrix
28. Principal Component Analysis (see the sketch after this list)
29. Dimensionality Reduction
30. Diagonal Covariance Matrix
31. Fisher Discriminant Analysis (pooled covariance)
32. Linear Inequalities
33. Network Models
34. Game Theory
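
To connect items 26 through 29, here is a minimal NumPy sketch of Principal Component Analysis via the Singular Value Decomposition, run on synthetic data made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 2-D data with one dominant direction of variation.
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0],
                                          [1.0, 0.5]])

# Center the columns, then factor X_c = U S V^T (Singular Value Decomposition).
X_c = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(X_c, full_matrices=False)

# Rows of Vt are the principal directions (eigenvectors of the covariance matrix);
# squared singular values give the variance explained by each component.
explained_variance = S**2 / (len(X_c) - 1)
print("principal directions:\n", Vt)
print("explained variance  :", explained_variance)

# Keep only the first principal component for a 1-D reduced representation.
X_reduced = X_c @ Vt[0]
```

Projecting onto the top components and dropping the rest is the dimensionality-reduction step, and in the new coordinates the covariance matrix becomes diagonal.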

Univariate and Multivariate Calculus and Optimization

   1. Graphs And Models
   2. Linear Models and Rates Of Change
   3. Functions and their graphs
   4. Fitting models to data
   5. Finding limits graphically and numerically
   6. Evaluating limits analytically
   7. Continuity and one sided limits
   8. Infinite Limits
   9. The derivative and the tangent line problem
   10. Basic Differentiation rules and rates of change
   11. Product and quotient rule and higher order derivatives
   12. Chain rule
   13. Implicit Differentiation
   14. Increasing and Decreasing Functions and First Derivative Test
   15. Concavity, Convexity, Non-Convexity and Second Derivative Test
   16. Unconstrained and Constrained Optimization
   17. Convex optimization
   18. Solving Unconstrained Convex Optimization
      18.1 First-Order Algorithm: Gradient Descent Algorithm for Univariate Function
      18.2 Second-Order Algorithm: Newton’s Method for Univariate Function
      18.3 Code for Gradient Descent and Newton’s Method (Univariate) #links (see the sketch after this list)
      18.4 Multivariate Functions
      18.5 Partial Derivatives
      18.6 Directional Derivatives and Gradients
      18.7 Tangent planes and normal Lines
      18.8 Extreme Value Theorem for Multivariate Function
      18.9 First-Order Algorithm: Gradient Descent Algorithm for Multivariate Function
      18.10 Second-Order Algorithm: Newton’s Method for Multivariate Function
      18.11 Code for Gradient Descent and Newton’s Method (Multivariate) #links
   19. Solving Constrained Convex Optimization
      19.1 Equality and Inequality Constraints, Active Sets and Binding Constraints
      19.2 Feasible Sets, Feasible Points, Interior and Boundary Points
      19.3 Barrier and Penalty Methods for solving Constrained Optimization
      19.4 Primal Problem
      19.5 Primal Problem and its Objective Function
      19.6 Understanding the mathematical path from the Primal Problem to the Dual Problem
      19.7 Dual Problem
      19.8 Dual Variables
      19.9 Duality
        19.9.1 Strong and Weak Duality
        19.9.2 Duality Gap
      19.10 Lagrange Multipliers as Dual Variables
      19.11 Construction of Lagrangian Function
        19.11.1 Physical Significance
        19.11.2 Importance
        19.11.3 Derivative of Lagrangian Function
      19.12 Dual Problem and its objective function
      19.13 Necessary and Sufficient Conditions for Optimality
      19.14 Strict Complementarity and Degenerate case
      19.15 Karush-Kuhn-Tucker Complementarity Conditions
      19.16 Solving Primal Problem through Dual Problem
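
For items 18.1 through 18.3, a minimal Python sketch of both univariate solvers on the toy convex function f(x) = (x - 3)^2 + 1, chosen here only for illustration:

```python
def f_prime(x):
    # First derivative of f(x) = (x - 3)^2 + 1, a convex function with minimum at x = 3.
    return 2.0 * (x - 3.0)

def f_double_prime(x):
    # Second derivative: constant positive curvature.
    return 2.0

def gradient_descent(x0, lr=0.1, steps=200):
    # First-order method: repeatedly step against the gradient with a fixed learning rate.
    x = x0
    for _ in range(steps):
        x = x - lr * f_prime(x)
    return x

def newtons_method(x0, steps=10):
    # Second-order method: scale the step by the inverse curvature.
    # For a quadratic it reaches the minimizer in a single step.
    x = x0
    for _ in range(steps):
        x = x - f_prime(x) / f_double_prime(x)
    return x

print("gradient descent:", gradient_descent(10.0))   # approaches 3.0
print("Newton's method :", newtons_method(10.0))     # lands on 3.0
```

The multivariate versions in 18.9 and 18.10 use the same updates, with the gradient vector and the inverse Hessian in place of the scalar derivatives.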

Supervised Classical Machine Learning

1. Bayes Theorem
2. Continuous and Discrete Probability Distributions
3. Gradient Descent and Newton's Method #links
4. Naive Bayes Classifier
5. Laplace Smoothing
6. Making a Matrix Non-Singular
7. Codes illustrating the above Probabilistic Classifier #links (see the sketch after this list)
8. Linear and Polynomial Regression
9. Underfitting and Overfitting
10. Normalization and Regularization
11. Codes illustrating Linear and Polynomial Regression #links
12. Logistic Regression
13. Codes illustrating Logistic Regression #links
14. Regression and Classification Trees
15. Codes illustrating Regression and Classification Trees #links
16. Pruning, Bagging, Boosting
17. Time Series Forecasting
18. Codes illustrating Time Series Forecasting #links
19. Support Vector Machine
20. Codes illustrating Support Vector Machine #links
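
For items 4, 5, and 7, a minimal Python sketch of a Naive Bayes text classifier with Laplace smoothing, trained on a tiny made-up corpus that exists only for illustration:

```python
import numpy as np

# Tiny hypothetical corpus: 1 = spam, 0 = ham (illustration only).
docs   = ["free prize win", "win cash prize", "meeting schedule today", "project meeting notes"]
labels = np.array([1, 1, 0, 0])

vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}

# Bag-of-words counts: X[i, j] = how often word j appears in document i.
X = np.zeros((len(docs), len(vocab)))
for i, d in enumerate(docs):
    for w in d.split():
        X[i, index[w]] += 1

alpha = 1.0   # Laplace smoothing: no word ever gets probability zero
log_prior, log_likelihood = {}, {}
for c in np.unique(labels):
    X_c = X[labels == c]
    log_prior[c] = np.log(len(X_c) / len(X))
    counts = X_c.sum(axis=0)
    # Smoothed word probabilities: (count + alpha) / (total + alpha * |vocab|).
    log_likelihood[c] = np.log((counts + alpha) / (counts.sum() + alpha * len(vocab)))

def predict(text):
    x = np.zeros(len(vocab))
    for w in text.split():
        if w in index:
            x[index[w]] += 1
    # Bayes theorem in log space: posterior is proportional to prior times likelihood.
    scores = {c: log_prior[c] + x @ log_likelihood[c] for c in log_prior}
    return max(scores, key=scores.get)

print(predict("win a free prize"))            # expected: 1 (spam)
print(predict("schedule a project meeting"))  # expected: 0 (ham)
```

The smoothing constant alpha is what keeps unseen words from zeroing out an entire class probability.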

Deep Learning

1. DNN
2. Hyperparameter Tuning and Batch Normalization
3. Neural Network written in plain Python to understand what TensorFlow does under the hood #links (see the sketch after this list)
4. Computer Vision
5. CNN #links
6. NLP
7. RNN #links
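
For item 3, a minimal sketch of a two-layer neural network in plain Python and NumPy with hand-written backpropagation, trained on the XOR toy problem chosen here just to show the moving parts that TensorFlow automates:

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR: not linearly separable, so at least one hidden layer is needed.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer with 4 units; weights start small and random.
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))

lr = 1.0
for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    y_hat = sigmoid(h @ W2 + b2)

    # Backward pass for mean-squared-error loss: chain rule, layer by layer.
    d_out = (y_hat - y) * y_hat * (1 - y_hat)
    d_hid = (d_out @ W2.T) * h * (1 - h)

    # Gradient descent updates on every parameter.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_hid
    b1 -= lr * d_hid.sum(axis=0, keepdims=True)

print(np.round(y_hat, 2))   # should be close to [[0], [1], [1], [0]]
```

Frameworks like TensorFlow build and differentiate this same computation graph automatically; writing it once by hand makes the DNN items above much less mysterious.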

Unsupervised Learning

will write later

Experience

AXIS INDIA MACHINE LEARNING AND RESEARCH LAB Job Title

will write later

AXIS INDIA MACHINE LEARNING AND RESEARCH LAB Job Title

will write later

Company Job Title

will write later

Education

School Dates Attended

will write later

School Dates Attended

will write later

Projects

Project Name

will write later


Project Name

will write later


Project Name

will write later


Skills

  • Artificial Intelligence
  • Data Analysis
  • Machine Learning
  • Probability
  • Statistics
  • Deep Learning
  • C++
  • Data Structures
  • Microprocessor
  • Computer Vision
  • HTML

Contact Me


Copyright Abhiruchi Khandelwal 2018