From 034bcc43d4c5aed32ab052dd60717b70908d880e Mon Sep 17 00:00:00 2001
From: lee
Date: Wed, 7 Apr 2021 15:45:31 +0100
Subject: [PATCH 1/4] day1: testing and feedback

---
 one_md_per_day_format/piscine/Week1/day1.md | 298 +++++++++++---------
 1 file changed, 166 insertions(+), 132 deletions(-)

diff --git a/one_md_per_day_format/piscine/Week1/day1.md b/one_md_per_day_format/piscine/Week1/day1.md
index 48efc73..b63d92b 100644
--- a/one_md_per_day_format/piscine/Week1/day1.md
+++ b/one_md_per_day_format/piscine/Week1/day1.md
@@ -1,24 +1,29 @@
-# D01 Piscine AI - Data Science
+# D01 Piscine AI - Data Science

The goal of this day is to understand practical usage of **NumPy**. **NumPy** is a commonly used Python data analysis package. By using **NumPy**, you can speed up your workflow and interface with other packages in the Python ecosystem, like scikit-learn, that use **NumPy** under the hood. **NumPy** was originally developed in the mid 2000s, and arose from an even older package called Numeric. This longevity means that almost every data analysis or machine learning package for Python leverages **NumPy** in some way.

-Version of NumPy I used to do the exercices: 1.18.1
+Version of NumPy I used to do the exercises: 1.18.1
+
I suggest using the most recent one.
+
Author:
# Outline: (optional)
+
A. Introduction
B. Rules
-C. Exercices
+C. Exercises

## Rules
+
-... Notebook Colabs or Jupyter Notebook
-Save one notebook per day or one per exercice. Use markdown to divide your notebook in different exercices.
-## Ressources
+
+... Notebook Colabs or Jupyter Notebook
+Save one notebook per day or one per exercise. Use markdown to divide your notebook into different exercises.
+
+## Resources

- https://medium.com/fintechexplained/why-should-we-use-NumPy-c14a4fb03ee9
- https://docs.scipy.org/doc/NumPy-1.15.0/reference/
- https://jakevdp.github.io/PythonDataScienceHandbook/

# Exercice 1 Your first NumPy array

-The goal of this exercice is to use many Python data types in **NumPy** arrays. **NumPy** arrays are intensively used in **NumPy** and **Pandas**. They are flexible and allow to use optimized **NumPy** underlying functions.
+The goal of this exercise is to use many Python data types in **NumPy** arrays. **NumPy** arrays are intensively used in **NumPy** and **Pandas**. They are flexible and allow the use of optimized underlying **NumPy** functions.

-1. Create a NumPy array that contains: an integer, a float, a string, a dictionary, a list, a tuple, a set and a boolean.
+1. Create a NumPy array that contains: an integer, a float, a string, a dictionary, a list, a tuple, a set and a boolean.

-The expected output is:
+The expected output is:

-```
+```python
for i in your_np_array:
    print(type(i))

<class 'int'>
<class 'float'>
<class 'str'>
<class 'dict'>
<class 'list'>
<class 'tuple'>
<class 'set'>
<class 'bool'>
-
```

-## Correction
+## Correction
+
-1. This question is validated if the your_numpy_array is a NumPy array. It can be checked with `type(your_numpy_array)` that should be equal to `numpy.ndarray`. And if the type of is element are as follow.
+1. This question is validated if your_numpy_array is a NumPy array. It can be checked with `type(your_numpy_array)`, which should be equal to `numpy.ndarray`, and if the types of its elements are as follows.

-```
+```python
for i in your_np_array:
    print(type(i))

<class 'int'>
<class 'float'>
<class 'str'>
<class 'dict'>
<class 'list'>
<class 'tuple'>
<class 'set'>
<class 'bool'>
-
```
+

---
+

# Exercice 2 Zeros

-The goal of this exercice is to learn to create a NumPy array with 0s.
+The goal of this exercise is to learn to create a NumPy array with 0s.

-1. Create a NumPy array of dimension **300** with zeros without filling it manually
+1. Create a NumPy array of dimension **300** with zeros without filling it manually
2. Reshape it to **(3,100)**

## Correction

1. The question is validated if the solution uses `np.zeros` and if the shape of the array is `(300,)`

-2. The question is validated if the solution uses `reshape` and the shape of the array is **(3, 100)**
+2. The question is validated if the solution uses `reshape` and the shape of the array is `(3, 100)`
+

---
+

# Exercice 3 Slicing

-The goal of this exercice is to learn NumPy indexing/slicing. It allows to access values of the NumPy array efficiently and without a for loop.
+The goal of this exercise is to learn NumPy indexing/slicing. It allows you to access values of the NumPy array efficiently and without a for loop.

-1. Create a NumPy array of dimension 1 that contains all integers from 1 to 100 ordered.
+1. Create a NumPy array of dimension 1 that contains all integers from 1 to 100 ordered.
2. Without using a for loop and using the array created in Q1, create an array that contains all odd integers. The expected output is: `np.array([1,3,...,99])`. *Hint*: it takes one line
3. Without using a for loop and using the array created in Q1, create an array that contains all even integers reversed. The expected output is: `np.array([100,98,...,2])`. *Hint*: it takes one line

@@ -90,47 +98,50 @@ The goal of this exercice is to learn NumPy indexing/slicing. It allows to acces

## Correction

+1. This question is validated if the solution doesn't involve a for loop or writing all integers from 1 to 100 and if the array is: `np.array([1,...,100])`. The list from 1 to 100 can be generated with an iterator: `range`.

-1. This question is validated if the solution doesn't involve a for loop or writing all integers from 1 to 100 and if the array is: `np.array([1,...,100])`. The list from 1 to 100 can be generated with an iterator: `range`.

-2. This question is validated if the solution is: `integers[1::2]`
+2. This question is validated if the solution is: `integers[::2]`

3. This question is validated if the solution is: `integers[::-2]`

-4. This question is validated if the array is: `np.array([[1,0,3,4,0,...,0,99,100]])`. There are at least two ways to get this results without for loop. The first one uses `integers[1::3] = 0` and the second involves creating a boolean array that indexes the array:
+4. This question is validated if the array is: `np.array([1,0,3,4,0,...,0,99,100])`. There are at least two ways to get this result without a for loop. The first one uses `integers[1::3] = 0` and the second involves creating a boolean array that indexes the array:
+
+```python
+mask = (integers+1)%3 == 0
+integers[mask] = 0
+```

-
- ```
- mask = (integers+1)%3 == 0
- integers[mask] = 0
- ```

---
+

# Exercice 4 Random

-The goal of this exercice is to learn to generate random data.
-In Data Science it is extremely useful to generate random data for many reasons:
-Lack of real data, create a random benchmark, use varied data sets.
-NumPy proposes a lot of options to generate random data. In statistics, assumptions are made on the distribution the data is from. All data distribution that can be generated randomly are described in the documentation. In this exerice we will focus on two distributions:
+The goal of this exercise is to learn to generate random data.
+In Data Science it is extremely useful to generate random data for many reasons:
+Lack of real data, create a random benchmark, use varied data sets.
+NumPy proposes a lot of options to generate random data. In statistics, assumptions are made on the distribution the data is from. All data distributions that can be generated randomly are described in the documentation. In this exercise we will focus on two distributions:

- Uniform: For example, if your goal is to generate a random number from 1 to 100 where every number has the same probability, you'll need the uniform distribution. NumPy provides `randint` and `uniform` to generate uniform distributions
+
- Normal: The normal distribution is the most important probability distribution in statistics because it fits many natural phenomena. For example, if you need to generate a data sample that represents **Heights of 14 Year Old Girls** it can be done using the normal distribution. In that case, we need two parameters: the mean (1m51) and the standard deviation (0.0741m). NumPy provides `randn` to generate normal distributions (among others)
+

https://docs.scipy.org/doc/NumPy-1.15.0/reference/routines.random.html

1. Set the seed to 888
2. Generate a **one-dimensional** array of size 100 with a normal distribution
3. Generate a **two-dimensional** array of size 8,8 with random integers from 1 to 10 - both included (same probability for each integer)
4. Generate a **three-dimensional** array of size 4,2,5 with random integers from 1 to 17 - both included (same probability for each integer)

-## Correction:
+## Correction

-For this exercice, as the results may change depending on the version of the package or the OS, I give the code to correct the exercice. If the code is correct and the output is not the same as mine, it is accepted.
+For this exercise, as the results may change depending on the version of the package or the OS, I give the code to correct the exercise. If the code is correct and the output is not the same as mine, it is accepted.

1. The solution is accepted if the solution is: `np.random.seed(888)`

-2. The solution is accepted if the solution is `np.random.randn(100)`. The value of the first element is `0.17620087373662233`.
+2. The solution is accepted if the solution is `np.random.randn(100)`. The value of the first element is `0.17620087373662233`.

3. The solution is accepted if the solution is `np.random.randint(1,11,(8,8))`.

-   ```
+   ```console
   Given the NumPy version and the seed, you should have this output:

   array([[ 7, 4, 8, 10, 2, 1, 1, 10],
@@ -141,10 +152,11 @@ For this exercice, as the results may change depending on the version of the pac
 [ 4, 1, 9, 7, 1, 4, 3, 5],
 [ 3, 2, 10, 8, 6, 3, 9, 4],
 [ 4, 4, 9, 2, 8, 5, 9, 5]])
-   ```
+   ```
+
4. The solution is accepted if the solution is `np.random.randint(1,18,(4,2,5))`.

-   ```
+   ```console
   Given the NumPy version and the seed, you should have this output:

   array([[[14, 16, 8, 15, 14],
@@ -158,52 +170,58 @@ For this exercice, as the results may change depending on the version of the pac
 [[ 3, 10, 5, 16, 13],
 [17, 12, 9, 7, 16]]])
-   ```
+   ```
+

---

-# Exercice 5: Split, contenate, reshape arrays
+# Exercice 5: Split, concatenate, reshape arrays

-The goal of this exercice is to learn to concatenate and reshape arrays.
+The goal of this exercise is to learn to concatenate and reshape arrays.

1. Generate an array with integers from 1 to 50: `array([1,...,50])`
+
2. Generate an array with integers from 51 to 100: `array([51,...,100])`

3. Using `np.concatenate`, concatenate the two arrays into: `array([1,...,100])`

-4. Reshape the previous array into:
-   ```
+4. Reshape the previous array into:
+
+   ```console
   array([[ 1, ... , 10],
   ...
   [ 91, ... , 100]])
   ```

-## Correction:
+## Correction

-1. This question is validated if the generated array is based on an iterator as `range` or `np.arange`. Check that 50 is part of the array.
+1. This question is validated if the generated array is based on an iterator such as `range` or `np.arange`. Check that 50 is part of the array.

-2. This question is validated if the generated array is based on an iterator as `range` or `np.arange`. Check that 100 is part of the array.
+2. This question is validated if the generated array is based on an iterator such as `range` or `np.arange`. Check that 100 is part of the array.

3. This question is validated if you concatenated this way `np.concatenate((array1,array2))`.

4. This question is validated if the result is:

   ```console
   array([[ 1, ... , 10],
   ...
   [ 91, ... , 100]])
   ```

- The easiest way is to use `array.reshape(10,10)`.
+The easiest way is to use `array.reshape(10,10)`.
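As a sanity check, here is a minimal end-to-end sketch of this correction; the names `array1` and `array2` are illustrative, not imposed by the exercise:

```python
import numpy as np

# 1. and 2. np.arange excludes the end bound, hence the +1 on each range
array1 = np.arange(1, 51)    # array([1, ..., 50])
array2 = np.arange(51, 101)  # array([51, ..., 100])

# 3. np.concatenate takes a sequence (tuple or list) of arrays
concatenated = np.concatenate((array1, array2))

# 4. reshape into 10 rows of 10 columns
print(concatenated.reshape(10, 10))
```

Note that `np.concatenate` expects the arrays wrapped in a tuple or list; passing them as two separate positional arguments raises an error because the second argument is interpreted as the `axis`.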
https://jakevdp.github.io/PythonDataScienceHandbook/02.02-the-basics-of-NumPy-arrays.html

---
+

# Exercice 6: Broadcasting and Slicing

-The goal of this exercice is to learn to access values of n-dimensional arrays and efficiently.
+The goal of this exercise is to learn to access values of n-dimensional arrays efficiently.

1. Create a 2-dimensional array of size 9,9 filled with 1s. Each value has to be an `int8`.
2. Using **slicing**, output this array:

-   ```
+   ```python
   array([[1, 1, 1, 1, 1, 1, 1, 1, 1],
       [1, 0, 0, 0, 0, 0, 0, 0, 1],
       [1, 0, 1, 1, 1, 1, 1, 0, 1],
       [1, 0, 1, 0, 0, 0, 1, 0, 1],
       [1, 0, 1, 0, 1, 0, 1, 0, 1],
       [1, 0, 1, 0, 0, 0, 1, 0, 1],
       [1, 0, 1, 1, 1, 1, 1, 0, 1],
       [1, 0, 0, 0, 0, 0, 0, 0, 1],
       [1, 1, 1, 1, 1, 1, 1, 1, 1]], dtype=int8)
   ```

@@ -217,14 +235,14 @@
https://jakevdp.github.io/PythonDataScienceHandbook/02.05-computation-on-arrays-broadcasting.html

-## Correction
+## Correction

-1. The question is validated if the output is the same as:
+1. The question is validated if the output is the same as:
`np.ones([9,9], dtype=np.int8)`

-2. The question is validated if the ouput is
+2. The question is validated if the output is

-   ```
+   ```console
   array([[1, 1, 1, 1, 1, 1, 1, 1, 1],
       [1, 0, 0, 0, 0, 0, 0, 0, 1],
       [1, 0, 1, 1, 1, 1, 1, 0, 1],
       [1, 0, 1, 0, 0, 0, 1, 0, 1],
       [1, 0, 1, 0, 1, 0, 1, 0, 1],
@@ -235,96 +253,109 @@ https://jakevdp.github.io/PythonDataScienceHandbook/02.05-computation-on-arrays-
       [1, 0, 0, 0, 0, 0, 0, 0, 1],
       [1, 1, 1, 1, 1, 1, 1, 1, 1]], dtype=int8)
   ```

-   The solution is not accepted if the values of the array have been changed one by one manually. The usage of the for loop is not allowed neither.
-   Here is an example of solution:
-   ```
+   The solution is not accepted if the values of the array have been changed one by one manually. Using a for loop is not allowed either.
+   Here is an example of a possible solution:
+
+   ```python
   x[1:8,1:8] = 0
   x[2:7,2:7] = 1
   x[3:6,3:6] = 0
   x[4,4] = 1
   ```
+

---

-# Exercice 7: NaN
+# Exercice 7: NaN

-The goal of this exercice is to learn to deal with missing data in NumPy and to manipulate NumPy arrays.
+The goal of this exercise is to learn to deal with missing data in NumPy and to manipulate NumPy arrays.

-Let us consider a 2-dimensional array that contains the grades at the past two exams. Some of the students missed the first exam. As the grade is missing it has been replaced with a NaN.
+Let us consider a 2-dimensional array that contains the grades at the past two exams. Some of the students missed the first exam. As the grade is missing it has been replaced with a `NaN`.

-1. Using `np.where` create a third column that is equal to the grade of the first exam if it exists and the second else. Add the column as the third column of the array.
+1. Using `np.where` create a third column that is equal to the grade of the first exam if it exists and to the grade of the second exam otherwise. Add this new column as the third column of the array.

-**Using a for loop or if/else statement is not allowed in this exercice.**
+**Using a for loop or if/else statement is not allowed in this exercise.**

-```
+```python
import numpy as np

generator = np.random.default_rng(123)
grades = np.round(generator.uniform(low = 0.0, high = 10.0, size = (10, 2)))
grades[[1,2,5,7], [0,0,0,0]] = np.nan
print(grades)
+```
+
+## Correction
+
+1. There are two steps in this exercise:
+
+- Create the vector that contains the grade of the first exam if available or the second. This can be done using `np.where`:
+
+```python
+ np.where(np.isnan(grades[:, 0]), grades[:, 1], grades[:, 0])
+```

-## Correction
-1. There are two steps in this exercice:
- - Create the vector that contains the the grade of the first exam if available or the second. This can be done using `np.where`:
- ```
- np.where(np.isnan(grades[:, 0]), grades[:, 1], grades[:, 0])
- ```
- - Add this vector as third column of the array. Here are two ways:
- ```
- np.insert(arr = grades, values = new_vector, axis = 1, obj = 2)
- np.hstack((grades, new_vector[:, None]))
- ```
- This question is validated if, without having used a for loop or having filled the array manually, the output is:
- ```
- [[ 7. 1. 7.]
- [nan 2. 2.]
- [nan 8. 8.]
- [ 9. 3. 9.]
- [ 8. 9. 8.]
- [nan 2. 2.]
- [ 8. 2. 8.]
- [nan 6. 6.]
- [ 9. 2. 9.]
- [ 8. 5. 8.]]
- ```

+- Add this vector as the third column of the array. Here are two ways:
+
+```python
+ np.insert(arr = grades, values = new_vector, axis = 1, obj = 2)
+
+ np.hstack((grades, new_vector[:, None]))
+```
+
+This question is validated if, without having used a for loop or having filled the array manually, the output is:
+
+```console
+[[ 7. 1. 7.]
+[nan 2. 2.]
+[nan 8. 8.]
+[ 9. 3. 9.]
+[ 8. 9. 8.]
+[nan 2. 2.]
+[ 8. 2. 8.]
+[nan 6. 6.]
+[ 9. 2. 9.]
+[ 8. 5. 8.]]
+```

https://jakevdp.github.io/PythonDataScienceHandbook/02.02-the-basics-of-NumPy-arrays.html

---

-# Exercice 8: Wine
+# Exercice 8: Wine

-The goal of this exercice is to learn to perform a basic data analysis on real data using NumPy.
+The goal of this exercise is to learn to perform a basic data analysis on real data using NumPy.

-The data set that will be used for this exercice is the wine data set.
-https://archive.ics.uci.edu/ml/datasets/wine+quality
+The data set that will be used for this exercise is the wine data set.
+
+https://archive.ics.uci.edu/ml/datasets/wine+quality

How to tell if a given 2D array has null columns?

1. Using `genfromtxt` load the data and reduce the size of the numpy array by optimizing the types. The sum of absolute differences between the original data set and the "memory" optimized one has to be smaller than 1e-3. I suggest using `np.float32`. Check that the numpy array weighs **76800 bytes**.

-2. Print 2nd, 7th and 12th rows as a two dimensional array
+2. Print the 2nd, 7th and 12th rows as a two dimensional array
+
3. Is there any wine with a percentage of alcohol greater than 20%? Return True or False

4. What is the average % of alcohol on all wines in the data set? If needed, drop `np.nan` values

-5. Compute the minimum, the maximum, the 25th percentile, the 75 percentile, the median of the pH
-6. Compute the average quality of the wines having the 20% least sulphates
-7. Compute the mean of all variables for wines having the best quality. Same question for the wines having the worst quality
-## Correction
+5. Compute the minimum, the maximum, the 25th percentile, the 50th percentile, the 75th percentile, and the median of the pH
+
+6. Compute the average quality of the wines having the 20% least sulphates
+
+7. Compute the mean of all variables for wines having the best quality. Same question for the wines having the worst quality

+## Correction
1. This question is validated if the text file has successfully been loaded in a NumPy array with
   `genfromtxt('winequality-red.csv', delimiter=';')` and the reduced array weighs **76800 bytes**

2. This question is validated if the output is

   ```python
   array([[ 7.4 , 0.7 , 0. , 1.9 , 0.076 , 11. , 34. ,
         0.9978, 3.51 , 0.56 , 9.4 , 5. ],
       [ 7.4 , 0.66 , 0. , 1.8 , 0.075 , 13. , 40. ,
@@ -332,15 +363,16 @@ How to tell if a given 2D array has null columns?
         0.9977, 3.51 , 0.56 , 9.4 , 5. ],
       [ 6.7 , 0.58 , 0.08 , 1.8 , 0.097 , 15. , 65. ,
         0.9959, 3.28 , 0.54 , 9.2 , 5. ]])
   ```

-   This slicing gives the answer `my_data[[1,6,11],:]`.
+This slicing gives the answer `my_data[[1,6,11],:]`.

3. This question is validated if the answer is False. There are many ways to get the answer: find the maximum or check values greater than 20.

4. This question is validated if the answer is 10.422983114446529.

5. This question is validated if the answer is:

   ```console
   pH stats
   25 percentile: 3.21
   50 percentile: 3.31
   75 percentile: 3.4
   mean: 3.31
   min: 2.74
   max: 4.01
   ```
@@ -349,58 +381,60 @@ How to tell if a given 2D array has null columns?

-   *Note: Using `percentile` or `median` may give different results depending on the duplicate values in the column. If you do not have my results please use `percentile`.*
+   > *Note: Using `percentile` or `median` may give different results depending on the duplicate values in the column. If you do not have my results please use `percentile`.*

6. This question is validated if the answer is `5.222222222222222`. The first step is to get the 20th percentile of the column `sulphates`, then create a boolean array that contains `True` if the value is smaller than the 20th percentile, then select these rows with the column quality and compute the `mean`.

7. This question is validated if the output for the best wines is:

```python
array([ 8.56666667, 0.42333333, 0.39111111, 2.57777778, 0.06844444,
    13.27777778, 33.44444444, 0.99521222, 3.26722222, 0.76777778,
    12.09444444, 8. ])
```

And the output for the worst wines is:

```python
array([ 8.36 , 0.8845 , 0.171 , 2.635 , 0.1225 , 11. ,
    24.9 , 0.997464, 3.398 , 0.57 , 9.955 , 3. ])
```

This can be done in three steps: Get the max, create a boolean mask that indicates rows with max quality, use this mask to subset the rows with the best quality and compute the mean on the axis 0.
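A minimal sketch of those three steps, assuming the semicolon-separated `winequality-red.csv` added later in this patch series, with quality stored in the last column:

```python
import numpy as np

# Load the data set; the CSV shipped with this patch series is semicolon-separated
data = np.genfromtxt('winequality-red.csv', delimiter=';', skip_header=1)

quality = data[:, -1]                 # quality is the last column
best_mask = quality == quality.max()  # boolean mask of the best wines
worst_mask = quality == quality.min() # boolean mask of the worst wines

print(data[best_mask].mean(axis=0))   # mean of every variable, best wines
print(data[worst_mask].mean(axis=0))  # mean of every variable, worst wines
```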
---
+

## Exercice 9 Football tournament

-The goal of this exercice is to learn to use permutations, complex
+The goal of this exercise is to learn to use permutations, complex

-A Football tournament is organized in your city. There are 10 teams and the director of the tournaments wants you to create a first round as exciting as possible. To do so, you are allowed to choose the pairs. As a former data scientist, you implemented a model based on teams' current season performance. This models predicts the score difference between two teams. You used this algorithm to predict the score difference for every possible pair.
-The matrix returned is a 2-dimensional array that contains in (i,j) the score difference between team i and j. The matrix is in `model_forecasts.txt`.
+A Football tournament is organized in your city. There are 10 teams and the director of the tournament wants you to create a first round as exciting as possible. To do so, you are allowed to choose the pairs. As a former data scientist, you implemented a model based on teams' current season performance. This model predicts the score difference between two teams. You used this algorithm to predict the score difference for every possible pair.
+The matrix returned is a 2-dimensional array that contains in (i,j) the score difference between team i and j. The matrix is in `model_forecasts.txt`.

-Using this output, what are the pairs that will give the most intersting matches ?
+Using this output, what are the pairs that will give the most interesting matches?

-If a team wins 7-1 the match is obviously less exciting than a match where the winner wins 2-1.
-The criteria that correponds to **the pairs that will give the most intersting matches** is **the pairs that minimize the sum of squared differences**
+If a team wins 7-1 the match is obviously less exciting than a match where the winner wins 2-1.
+The criterion that corresponds to **the pairs that will give the most interesting matches** is **the pairs that minimize the sum of squared differences**

-The expected output is:
+The expected output is:

-```
+```console
[[m1_t1 m2_t1 m3_t1 m4_t1 m5_t1]
[m1_t2 m2_t2 m3_t2 m4_t2 m5_t2]]
```

- - m1_t1 stands for match1_team1
- - m1_t1 plays against m1_t2 ...
+- m1_t1 stands for match1_team1
+- m1_t1 plays against m1_t2 ...

**Usage of a for loop is not allowed, you may need to use the library** `itertools` **to create permutations**

## Correction

-This exercice is validated if the output is:
+This exercise is validated if the output is:

-```
+```console
[[0 3 1 2 4]
[7 6 8 9 5]]
```

From 558e97e4dbdb6f8a2d30763b19e23fe9415a4ea2 Mon Sep 17 00:00:00 2001
From: "b.ghazlane"
Date: Thu, 8 Apr 2021 00:05:30 +0200
Subject: [PATCH 2/4] fix: correct the ressources URL

---
 one_md_per_day_format/piscine/Week1/day1.md | 9 ++++-----
 1 file changed, 4 insertions(+), 5 deletions(-)

diff --git a/one_md_per_day_format/piscine/Week1/day1.md b/one_md_per_day_format/piscine/Week1/day1.md
index b63d92b..872ff54 100644
--- a/one_md_per_day_format/piscine/Week1/day1.md
+++ b/one_md_per_day_format/piscine/Week1/day1.md
@@ -26,7 +26,7 @@ Save one notebook per day or one per exercise. Use markdown to divide your noteb
## Resources

- https://medium.com/fintechexplained/why-should-we-use-NumPy-c14a4fb03ee9
-- https://docs.scipy.org/doc/NumPy-1.15.0/reference/
+- https://numpy.org/doc/
- https://jakevdp.github.io/PythonDataScienceHandbook/

# Exercice 1 Your first NumPy array

@@ -124,7 +124,7 @@ NumPy proposes a lot of options to generate random data.
In statistics, assumptions are made on the distribution the data is from. All data distributions that can be generated randomly are described in the documentation. In this exercise we will focus on two distributions:

- Normal: The normal distribution is the most important probability distribution in statistics because it fits many natural phenomena. For example, if you need to generate a data sample that represents **Heights of 14 Year Old Girls** it can be done using the normal distribution. In that case, we need two parameters: the mean (1m51) and the standard deviation (0.0741m). NumPy provides `randn` to generate normal distributions (among others)

-https://docs.scipy.org/doc/NumPy-1.15.0/reference/routines.random.html
+https://numpy.org/doc/stable/reference/random/generator.html

1. Set the seed to 888
2. Generate a **one-dimensional** array of size 100 with a normal distribution

@@ -210,7 +210,7 @@ The goal of this exercise is to learn to concatenate and reshape arrays.

The easiest way is to use `array.reshape(10,10)`.

-https://jakevdp.github.io/PythonDataScienceHandbook/02.02-the-basics-of-NumPy-arrays.html
+https://jakevdp.github.io/PythonDataScienceHandbook/ (section: The Basics of NumPy Arrays)

---

@@ -233,7 +233,7 @@ The goal of this exercise is to learn to access values of n-dimensional arrays e
 [1, 1, 1, 1, 1, 1, 1, 1, 1]], dtype=int8)
 ```

-https://jakevdp.github.io/PythonDataScienceHandbook/02.05-computation-on-arrays-broadcasting.html
+https://jakevdp.github.io/PythonDataScienceHandbook/ (section: Computation on Arrays: Broadcasting)

## Correction

@@ -320,7 +320,6 @@ This question is validated if, without having used a for loop or having filled t
 [ 8. 5. 8.]]
 ```

-https://jakevdp.github.io/PythonDataScienceHandbook/02.02-the-basics-of-NumPy-arrays.html

---

From a459f3dc1e146e010edabfbe782aa08bf4512bac Mon Sep 17 00:00:00 2001
From: "b.ghazlane"
Date: Thu, 8 Apr 2021 00:30:10 +0200
Subject: [PATCH 3/4] fix: add ex 8 data, fix ex8 correction, add itertools link

---
 .../Week1/data/D01/ex8/winequality-red.csv    | 1600 +++++++++++++++++
 .../Week1/data/D01/ex8/winequality.names      |   72 +
 .../data/D01/{ex6 => ex9}/model_forecasts.txt |    0
 one_md_per_day_format/piscine/Week1/day1.md   |    8 +-
 4 files changed, 1677 insertions(+), 3 deletions(-)
 create mode 100644 one_md_per_day_format/piscine/Week1/data/D01/ex8/winequality-red.csv
 create mode 100644 one_md_per_day_format/piscine/Week1/data/D01/ex8/winequality.names
 rename one_md_per_day_format/piscine/Week1/data/D01/{ex6 => ex9}/model_forecasts.txt (100%)

diff --git a/one_md_per_day_format/piscine/Week1/data/D01/ex8/winequality-red.csv b/one_md_per_day_format/piscine/Week1/data/D01/ex8/winequality-red.csv
new file mode 100644
index 0000000..9bb4e3c
--- /dev/null
+++ b/one_md_per_day_format/piscine/Week1/data/D01/ex8/winequality-red.csv
@@ -0,0 +1,1600 @@
+"fixed acidity";"volatile acidity";"citric acid";"residual sugar";"chlorides";"free sulfur dioxide";"total sulfur dioxide";"density";"pH";"sulphates";"alcohol";"quality"
+7.4;0.7;0;1.9;0.076;11;34;0.9978;3.51;0.56;9.4;5
+7.8;0.88;0;2.6;0.098;25;67;0.9968;3.2;0.68;9.8;5
+7.8;0.76;0.04;2.3;0.092;15;54;0.997;3.26;0.65;9.8;5
+11.2;0.28;0.56;1.9;0.075;17;60;0.998;3.16;0.58;9.8;6
+7.4;0.7;0;1.9;0.076;11;34;0.9978;3.51;0.56;9.4;5
+7.4;0.66;0;1.8;0.075;13;40;0.9978;3.51;0.56;9.4;5
+7.9;0.6;0.06;1.6;0.069;15;59;0.9964;3.3;0.46;9.4;5
+7.3;0.65;0;1.2;0.065;15;21;0.9946;3.39;0.47;10;7
+7.8;0.58;0.02;2;0.073;9;18;0.9968;3.36;0.57;9.5;7
+7.5;0.5;0.36;6.1;0.071;17;102;0.9978;3.35;0.8;10.5;5
+6.7;0.58;0.08;1.8;0.097;15;65;0.9959;3.28;0.54;9.2;5
+7.5;0.5;0.36;6.1;0.071;17;102;0.9978;3.35;0.8;10.5;5
+5.6;0.615;0;1.6;0.089;16;59;0.9943;3.58;0.52;9.9;5
+7.8;0.61;0.29;1.6;0.114;9;29;0.9974;3.26;1.56;9.1;5
+8.9;0.62;0.18;3.8;0.176;52;145;0.9986;3.16;0.88;9.2;5 +8.9;0.62;0.19;3.9;0.17;51;148;0.9986;3.17;0.93;9.2;5 +8.5;0.28;0.56;1.8;0.092;35;103;0.9969;3.3;0.75;10.5;7 +8.1;0.56;0.28;1.7;0.368;16;56;0.9968;3.11;1.28;9.3;5 +7.4;0.59;0.08;4.4;0.086;6;29;0.9974;3.38;0.5;9;4 +7.9;0.32;0.51;1.8;0.341;17;56;0.9969;3.04;1.08;9.2;6 +8.9;0.22;0.48;1.8;0.077;29;60;0.9968;3.39;0.53;9.4;6 +7.6;0.39;0.31;2.3;0.082;23;71;0.9982;3.52;0.65;9.7;5 +7.9;0.43;0.21;1.6;0.106;10;37;0.9966;3.17;0.91;9.5;5 +8.5;0.49;0.11;2.3;0.084;9;67;0.9968;3.17;0.53;9.4;5 +6.9;0.4;0.14;2.4;0.085;21;40;0.9968;3.43;0.63;9.7;6 +6.3;0.39;0.16;1.4;0.08;11;23;0.9955;3.34;0.56;9.3;5 +7.6;0.41;0.24;1.8;0.08;4;11;0.9962;3.28;0.59;9.5;5 +7.9;0.43;0.21;1.6;0.106;10;37;0.9966;3.17;0.91;9.5;5 +7.1;0.71;0;1.9;0.08;14;35;0.9972;3.47;0.55;9.4;5 +7.8;0.645;0;2;0.082;8;16;0.9964;3.38;0.59;9.8;6 +6.7;0.675;0.07;2.4;0.089;17;82;0.9958;3.35;0.54;10.1;5 +6.9;0.685;0;2.5;0.105;22;37;0.9966;3.46;0.57;10.6;6 +8.3;0.655;0.12;2.3;0.083;15;113;0.9966;3.17;0.66;9.8;5 +6.9;0.605;0.12;10.7;0.073;40;83;0.9993;3.45;0.52;9.4;6 +5.2;0.32;0.25;1.8;0.103;13;50;0.9957;3.38;0.55;9.2;5 +7.8;0.645;0;5.5;0.086;5;18;0.9986;3.4;0.55;9.6;6 +7.8;0.6;0.14;2.4;0.086;3;15;0.9975;3.42;0.6;10.8;6 +8.1;0.38;0.28;2.1;0.066;13;30;0.9968;3.23;0.73;9.7;7 +5.7;1.13;0.09;1.5;0.172;7;19;0.994;3.5;0.48;9.8;4 +7.3;0.45;0.36;5.9;0.074;12;87;0.9978;3.33;0.83;10.5;5 +7.3;0.45;0.36;5.9;0.074;12;87;0.9978;3.33;0.83;10.5;5 +8.8;0.61;0.3;2.8;0.088;17;46;0.9976;3.26;0.51;9.3;4 +7.5;0.49;0.2;2.6;0.332;8;14;0.9968;3.21;0.9;10.5;6 +8.1;0.66;0.22;2.2;0.069;9;23;0.9968;3.3;1.2;10.3;5 +6.8;0.67;0.02;1.8;0.05;5;11;0.9962;3.48;0.52;9.5;5 +4.6;0.52;0.15;2.1;0.054;8;65;0.9934;3.9;0.56;13.1;4 +7.7;0.935;0.43;2.2;0.114;22;114;0.997;3.25;0.73;9.2;5 +8.7;0.29;0.52;1.6;0.113;12;37;0.9969;3.25;0.58;9.5;5 +6.4;0.4;0.23;1.6;0.066;5;12;0.9958;3.34;0.56;9.2;5 +5.6;0.31;0.37;1.4;0.074;12;96;0.9954;3.32;0.58;9.2;5 +8.8;0.66;0.26;1.7;0.074;4;23;0.9971;3.15;0.74;9.2;5 +6.6;0.52;0.04;2.2;0.069;8;15;0.9956;3.4;0.63;9.4;6 +6.6;0.5;0.04;2.1;0.068;6;14;0.9955;3.39;0.64;9.4;6 +8.6;0.38;0.36;3;0.081;30;119;0.997;3.2;0.56;9.4;5 +7.6;0.51;0.15;2.8;0.11;33;73;0.9955;3.17;0.63;10.2;6 +7.7;0.62;0.04;3.8;0.084;25;45;0.9978;3.34;0.53;9.5;5 +10.2;0.42;0.57;3.4;0.07;4;10;0.9971;3.04;0.63;9.6;5 +7.5;0.63;0.12;5.1;0.111;50;110;0.9983;3.26;0.77;9.4;5 +7.8;0.59;0.18;2.3;0.076;17;54;0.9975;3.43;0.59;10;5 +7.3;0.39;0.31;2.4;0.074;9;46;0.9962;3.41;0.54;9.4;6 +8.8;0.4;0.4;2.2;0.079;19;52;0.998;3.44;0.64;9.2;5 +7.7;0.69;0.49;1.8;0.115;20;112;0.9968;3.21;0.71;9.3;5 +7.5;0.52;0.16;1.9;0.085;12;35;0.9968;3.38;0.62;9.5;7 +7;0.735;0.05;2;0.081;13;54;0.9966;3.39;0.57;9.8;5 +7.2;0.725;0.05;4.65;0.086;4;11;0.9962;3.41;0.39;10.9;5 +7.2;0.725;0.05;4.65;0.086;4;11;0.9962;3.41;0.39;10.9;5 +7.5;0.52;0.11;1.5;0.079;11;39;0.9968;3.42;0.58;9.6;5 +6.6;0.705;0.07;1.6;0.076;6;15;0.9962;3.44;0.58;10.7;5 +9.3;0.32;0.57;2;0.074;27;65;0.9969;3.28;0.79;10.7;5 +8;0.705;0.05;1.9;0.074;8;19;0.9962;3.34;0.95;10.5;6 +7.7;0.63;0.08;1.9;0.076;15;27;0.9967;3.32;0.54;9.5;6 +7.7;0.67;0.23;2.1;0.088;17;96;0.9962;3.32;0.48;9.5;5 +7.7;0.69;0.22;1.9;0.084;18;94;0.9961;3.31;0.48;9.5;5 +8.3;0.675;0.26;2.1;0.084;11;43;0.9976;3.31;0.53;9.2;4 +9.7;0.32;0.54;2.5;0.094;28;83;0.9984;3.28;0.82;9.6;5 +8.8;0.41;0.64;2.2;0.093;9;42;0.9986;3.54;0.66;10.5;5 +8.8;0.41;0.64;2.2;0.093;9;42;0.9986;3.54;0.66;10.5;5 +6.8;0.785;0;2.4;0.104;14;30;0.9966;3.52;0.55;10.7;6 +6.7;0.75;0.12;2;0.086;12;80;0.9958;3.38;0.52;10.1;5 +8.3;0.625;0.2;1.5;0.08;27;119;0.9972;3.16;1.12;9.1;4 
+6.2;0.45;0.2;1.6;0.069;3;15;0.9958;3.41;0.56;9.2;5 +7.8;0.43;0.7;1.9;0.464;22;67;0.9974;3.13;1.28;9.4;5 +7.4;0.5;0.47;2;0.086;21;73;0.997;3.36;0.57;9.1;5 +7.3;0.67;0.26;1.8;0.401;16;51;0.9969;3.16;1.14;9.4;5 +6.3;0.3;0.48;1.8;0.069;18;61;0.9959;3.44;0.78;10.3;6 +6.9;0.55;0.15;2.2;0.076;19;40;0.9961;3.41;0.59;10.1;5 +8.6;0.49;0.28;1.9;0.11;20;136;0.9972;2.93;1.95;9.9;6 +7.7;0.49;0.26;1.9;0.062;9;31;0.9966;3.39;0.64;9.6;5 +9.3;0.39;0.44;2.1;0.107;34;125;0.9978;3.14;1.22;9.5;5 +7;0.62;0.08;1.8;0.076;8;24;0.9978;3.48;0.53;9;5 +7.9;0.52;0.26;1.9;0.079;42;140;0.9964;3.23;0.54;9.5;5 +8.6;0.49;0.28;1.9;0.11;20;136;0.9972;2.93;1.95;9.9;6 +8.6;0.49;0.29;2;0.11;19;133;0.9972;2.93;1.98;9.8;5 +7.7;0.49;0.26;1.9;0.062;9;31;0.9966;3.39;0.64;9.6;5 +5;1.02;0.04;1.4;0.045;41;85;0.9938;3.75;0.48;10.5;4 +4.7;0.6;0.17;2.3;0.058;17;106;0.9932;3.85;0.6;12.9;6 +6.8;0.775;0;3;0.102;8;23;0.9965;3.45;0.56;10.7;5 +7;0.5;0.25;2;0.07;3;22;0.9963;3.25;0.63;9.2;5 +7.6;0.9;0.06;2.5;0.079;5;10;0.9967;3.39;0.56;9.8;5 +8.1;0.545;0.18;1.9;0.08;13;35;0.9972;3.3;0.59;9;6 +8.3;0.61;0.3;2.1;0.084;11;50;0.9972;3.4;0.61;10.2;6 +7.8;0.5;0.3;1.9;0.075;8;22;0.9959;3.31;0.56;10.4;6 +8.1;0.545;0.18;1.9;0.08;13;35;0.9972;3.3;0.59;9;6 +8.1;0.575;0.22;2.1;0.077;12;65;0.9967;3.29;0.51;9.2;5 +7.2;0.49;0.24;2.2;0.07;5;36;0.996;3.33;0.48;9.4;5 +8.1;0.575;0.22;2.1;0.077;12;65;0.9967;3.29;0.51;9.2;5 +7.8;0.41;0.68;1.7;0.467;18;69;0.9973;3.08;1.31;9.3;5 +6.2;0.63;0.31;1.7;0.088;15;64;0.9969;3.46;0.79;9.3;5 +8;0.33;0.53;2.5;0.091;18;80;0.9976;3.37;0.8;9.6;6 +8.1;0.785;0.52;2;0.122;37;153;0.9969;3.21;0.69;9.3;5 +7.8;0.56;0.19;1.8;0.104;12;47;0.9964;3.19;0.93;9.5;5 +8.4;0.62;0.09;2.2;0.084;11;108;0.9964;3.15;0.66;9.8;5 +8.4;0.6;0.1;2.2;0.085;14;111;0.9964;3.15;0.66;9.8;5 +10.1;0.31;0.44;2.3;0.08;22;46;0.9988;3.32;0.67;9.7;6 +7.8;0.56;0.19;1.8;0.104;12;47;0.9964;3.19;0.93;9.5;5 +9.4;0.4;0.31;2.2;0.09;13;62;0.9966;3.07;0.63;10.5;6 +8.3;0.54;0.28;1.9;0.077;11;40;0.9978;3.39;0.61;10;6 +7.8;0.56;0.12;2;0.082;7;28;0.997;3.37;0.5;9.4;6 +8.8;0.55;0.04;2.2;0.119;14;56;0.9962;3.21;0.6;10.9;6 +7;0.69;0.08;1.8;0.097;22;89;0.9959;3.34;0.54;9.2;6 +7.3;1.07;0.09;1.7;0.178;10;89;0.9962;3.3;0.57;9;5 +8.8;0.55;0.04;2.2;0.119;14;56;0.9962;3.21;0.6;10.9;6 +7.3;0.695;0;2.5;0.075;3;13;0.998;3.49;0.52;9.2;5 +8;0.71;0;2.6;0.08;11;34;0.9976;3.44;0.53;9.5;5 +7.8;0.5;0.17;1.6;0.082;21;102;0.996;3.39;0.48;9.5;5 +9;0.62;0.04;1.9;0.146;27;90;0.9984;3.16;0.7;9.4;5 +8.2;1.33;0;1.7;0.081;3;12;0.9964;3.53;0.49;10.9;5 +8.1;1.33;0;1.8;0.082;3;12;0.9964;3.54;0.48;10.9;5 +8;0.59;0.16;1.8;0.065;3;16;0.9962;3.42;0.92;10.5;7 +6.1;0.38;0.15;1.8;0.072;6;19;0.9955;3.42;0.57;9.4;5 +8;0.745;0.56;2;0.118;30;134;0.9968;3.24;0.66;9.4;5 +5.6;0.5;0.09;2.3;0.049;17;99;0.9937;3.63;0.63;13;5 +5.6;0.5;0.09;2.3;0.049;17;99;0.9937;3.63;0.63;13;5 +6.6;0.5;0.01;1.5;0.06;17;26;0.9952;3.4;0.58;9.8;6 +7.9;1.04;0.05;2.2;0.084;13;29;0.9959;3.22;0.55;9.9;6 +8.4;0.745;0.11;1.9;0.09;16;63;0.9965;3.19;0.82;9.6;5 +8.3;0.715;0.15;1.8;0.089;10;52;0.9968;3.23;0.77;9.5;5 +7.2;0.415;0.36;2;0.081;13;45;0.9972;3.48;0.64;9.2;5 +7.8;0.56;0.19;2.1;0.081;15;105;0.9962;3.33;0.54;9.5;5 +7.8;0.56;0.19;2;0.081;17;108;0.9962;3.32;0.54;9.5;5 +8.4;0.745;0.11;1.9;0.09;16;63;0.9965;3.19;0.82;9.6;5 +8.3;0.715;0.15;1.8;0.089;10;52;0.9968;3.23;0.77;9.5;5 +5.2;0.34;0;1.8;0.05;27;63;0.9916;3.68;0.79;14;6 +6.3;0.39;0.08;1.7;0.066;3;20;0.9954;3.34;0.58;9.4;5 +5.2;0.34;0;1.8;0.05;27;63;0.9916;3.68;0.79;14;6 +8.1;0.67;0.55;1.8;0.117;32;141;0.9968;3.17;0.62;9.4;5 +5.8;0.68;0.02;1.8;0.087;21;94;0.9944;3.54;0.52;10;5 
+7.6;0.49;0.26;1.6;0.236;10;88;0.9968;3.11;0.8;9.3;5 +6.9;0.49;0.1;2.3;0.074;12;30;0.9959;3.42;0.58;10.2;6 +8.2;0.4;0.44;2.8;0.089;11;43;0.9975;3.53;0.61;10.5;6 +7.3;0.33;0.47;2.1;0.077;5;11;0.9958;3.33;0.53;10.3;6 +9.2;0.52;1;3.4;0.61;32;69;0.9996;2.74;2.0;9.4;4 +7.5;0.6;0.03;1.8;0.095;25;99;0.995;3.35;0.54;10.1;5 +7.5;0.6;0.03;1.8;0.095;25;99;0.995;3.35;0.54;10.1;5 +7.1;0.43;0.42;5.5;0.07;29;129;0.9973;3.42;0.72;10.5;5 +7.1;0.43;0.42;5.5;0.071;28;128;0.9973;3.42;0.71;10.5;5 +7.1;0.43;0.42;5.5;0.07;29;129;0.9973;3.42;0.72;10.5;5 +7.1;0.43;0.42;5.5;0.071;28;128;0.9973;3.42;0.71;10.5;5 +7.1;0.68;0;2.2;0.073;12;22;0.9969;3.48;0.5;9.3;5 +6.8;0.6;0.18;1.9;0.079;18;86;0.9968;3.59;0.57;9.3;6 +7.6;0.95;0.03;2;0.09;7;20;0.9959;3.2;0.56;9.6;5 +7.6;0.68;0.02;1.3;0.072;9;20;0.9965;3.17;1.08;9.2;4 +7.8;0.53;0.04;1.7;0.076;17;31;0.9964;3.33;0.56;10;6 +7.4;0.6;0.26;7.3;0.07;36;121;0.9982;3.37;0.49;9.4;5 +7.3;0.59;0.26;7.2;0.07;35;121;0.9981;3.37;0.49;9.4;5 +7.8;0.63;0.48;1.7;0.1;14;96;0.9961;3.19;0.62;9.5;5 +6.8;0.64;0.1;2.1;0.085;18;101;0.9956;3.34;0.52;10.2;5 +7.3;0.55;0.03;1.6;0.072;17;42;0.9956;3.37;0.48;9;4 +6.8;0.63;0.07;2.1;0.089;11;44;0.9953;3.47;0.55;10.4;6 +7.5;0.705;0.24;1.8;0.36;15;63;0.9964;3;1.59;9.5;5 +7.9;0.885;0.03;1.8;0.058;4;8;0.9972;3.36;0.33;9.1;4 +8;0.42;0.17;2;0.073;6;18;0.9972;3.29;0.61;9.2;6 +8;0.42;0.17;2;0.073;6;18;0.9972;3.29;0.61;9.2;6 +7.4;0.62;0.05;1.9;0.068;24;42;0.9961;3.42;0.57;11.5;6 +7.3;0.38;0.21;2;0.08;7;35;0.9961;3.33;0.47;9.5;5 +6.9;0.5;0.04;1.5;0.085;19;49;0.9958;3.35;0.78;9.5;5 +7.3;0.38;0.21;2;0.08;7;35;0.9961;3.33;0.47;9.5;5 +7.5;0.52;0.42;2.3;0.087;8;38;0.9972;3.58;0.61;10.5;6 +7;0.805;0;2.5;0.068;7;20;0.9969;3.48;0.56;9.6;5 +8.8;0.61;0.14;2.4;0.067;10;42;0.9969;3.19;0.59;9.5;5 +8.8;0.61;0.14;2.4;0.067;10;42;0.9969;3.19;0.59;9.5;5 +8.9;0.61;0.49;2;0.27;23;110;0.9972;3.12;1.02;9.3;5 +7.2;0.73;0.02;2.5;0.076;16;42;0.9972;3.44;0.52;9.3;5 +6.8;0.61;0.2;1.8;0.077;11;65;0.9971;3.54;0.58;9.3;5 +6.7;0.62;0.21;1.9;0.079;8;62;0.997;3.52;0.58;9.3;6 +8.9;0.31;0.57;2;0.111;26;85;0.9971;3.26;0.53;9.7;5 +7.4;0.39;0.48;2;0.082;14;67;0.9972;3.34;0.55;9.2;5 +7.7;0.705;0.1;2.6;0.084;9;26;0.9976;3.39;0.49;9.7;5 +7.9;0.5;0.33;2;0.084;15;143;0.9968;3.2;0.55;9.5;5 +7.9;0.49;0.32;1.9;0.082;17;144;0.9968;3.2;0.55;9.5;5 +8.2;0.5;0.35;2.9;0.077;21;127;0.9976;3.23;0.62;9.4;5 +6.4;0.37;0.25;1.9;0.074;21;49;0.9974;3.57;0.62;9.8;6 +6.8;0.63;0.12;3.8;0.099;16;126;0.9969;3.28;0.61;9.5;5 +7.6;0.55;0.21;2.2;0.071;7;28;0.9964;3.28;0.55;9.7;5 +7.6;0.55;0.21;2.2;0.071;7;28;0.9964;3.28;0.55;9.7;5 +7.8;0.59;0.33;2;0.074;24;120;0.9968;3.25;0.54;9.4;5 +7.3;0.58;0.3;2.4;0.074;15;55;0.9968;3.46;0.59;10.2;5 +11.5;0.3;0.6;2;0.067;12;27;0.9981;3.11;0.97;10.1;6 +5.4;0.835;0.08;1.2;0.046;13;93;0.9924;3.57;0.85;13;7 +6.9;1.09;0.06;2.1;0.061;12;31;0.9948;3.51;0.43;11.4;4 +9.6;0.32;0.47;1.4;0.056;9;24;0.99695;3.22;0.82;10.3;7 +8.8;0.37;0.48;2.1;0.097;39;145;0.9975;3.04;1.03;9.3;5 +6.8;0.5;0.11;1.5;0.075;16;49;0.99545;3.36;0.79;9.5;5 +7;0.42;0.35;1.6;0.088;16;39;0.9961;3.34;0.55;9.2;5 +7;0.43;0.36;1.6;0.089;14;37;0.99615;3.34;0.56;9.2;6 +12.8;0.3;0.74;2.6;0.095;9;28;0.9994;3.2;0.77;10.8;7 +12.8;0.3;0.74;2.6;0.095;9;28;0.9994;3.2;0.77;10.8;7 +7.8;0.57;0.31;1.8;0.069;26;120;0.99625;3.29;0.53;9.3;5 +7.8;0.44;0.28;2.7;0.1;18;95;0.9966;3.22;0.67;9.4;5 +11;0.3;0.58;2.1;0.054;7;19;0.998;3.31;0.88;10.5;7 +9.7;0.53;0.6;2;0.039;5;19;0.99585;3.3;0.86;12.4;6 +8;0.725;0.24;2.8;0.083;10;62;0.99685;3.35;0.56;10;6 +11.6;0.44;0.64;2.1;0.059;5;15;0.998;3.21;0.67;10.2;6 +8.2;0.57;0.26;2.2;0.06;28;65;0.9959;3.3;0.43;10.1;5 
+7.8;0.735;0.08;2.4;0.092;10;41;0.9974;3.24;0.71;9.8;6 +7;0.49;0.49;5.6;0.06;26;121;0.9974;3.34;0.76;10.5;5 +8.7;0.625;0.16;2;0.101;13;49;0.9962;3.14;0.57;11;5 +8.1;0.725;0.22;2.2;0.072;11;41;0.9967;3.36;0.55;9.1;5 +7.5;0.49;0.19;1.9;0.076;10;44;0.9957;3.39;0.54;9.7;5 +7.8;0.53;0.33;2.4;0.08;24;144;0.99655;3.3;0.6;9.5;5 +7.8;0.34;0.37;2;0.082;24;58;0.9964;3.34;0.59;9.4;6 +7.4;0.53;0.26;2;0.101;16;72;0.9957;3.15;0.57;9.4;5 +6.8;0.61;0.04;1.5;0.057;5;10;0.99525;3.42;0.6;9.5;5 +8.6;0.645;0.25;2;0.083;8;28;0.99815;3.28;0.6;10;6 +8.4;0.635;0.36;2;0.089;15;55;0.99745;3.31;0.57;10.4;4 +7.7;0.43;0.25;2.6;0.073;29;63;0.99615;3.37;0.58;10.5;6 +8.9;0.59;0.5;2;0.337;27;81;0.9964;3.04;1.61;9.5;6 +9;0.82;0.14;2.6;0.089;9;23;0.9984;3.39;0.63;9.8;5 +7.7;0.43;0.25;2.6;0.073;29;63;0.99615;3.37;0.58;10.5;6 +6.9;0.52;0.25;2.6;0.081;10;37;0.99685;3.46;0.5;11;5 +5.2;0.48;0.04;1.6;0.054;19;106;0.9927;3.54;0.62;12.2;7 +8;0.38;0.06;1.8;0.078;12;49;0.99625;3.37;0.52;9.9;6 +8.5;0.37;0.2;2.8;0.09;18;58;0.998;3.34;0.7;9.6;6 +6.9;0.52;0.25;2.6;0.081;10;37;0.99685;3.46;0.5;11;5 +8.2;1;0.09;2.3;0.065;7;37;0.99685;3.32;0.55;9;6 +7.2;0.63;0;1.9;0.097;14;38;0.99675;3.37;0.58;9;6 +7.2;0.63;0;1.9;0.097;14;38;0.99675;3.37;0.58;9;6 +7.2;0.645;0;1.9;0.097;15;39;0.99675;3.37;0.58;9.2;6 +7.2;0.63;0;1.9;0.097;14;38;0.99675;3.37;0.58;9;6 +8.2;1;0.09;2.3;0.065;7;37;0.99685;3.32;0.55;9;6 +8.9;0.635;0.37;1.7;0.263;5;62;0.9971;3;1.09;9.3;5 +12;0.38;0.56;2.1;0.093;6;24;0.99925;3.14;0.71;10.9;6 +7.7;0.58;0.1;1.8;0.102;28;109;0.99565;3.08;0.49;9.8;6 +15;0.21;0.44;2.2;0.075;10;24;1.00005;3.07;0.84;9.2;7 +15;0.21;0.44;2.2;0.075;10;24;1.00005;3.07;0.84;9.2;7 +7.3;0.66;0;2;0.084;6;23;0.9983;3.61;0.96;9.9;6 +7.1;0.68;0.07;1.9;0.075;16;51;0.99685;3.38;0.52;9.5;5 +8.2;0.6;0.17;2.3;0.072;11;73;0.9963;3.2;0.45;9.3;5 +7.7;0.53;0.06;1.7;0.074;9;39;0.99615;3.35;0.48;9.8;6 +7.3;0.66;0;2;0.084;6;23;0.9983;3.61;0.96;9.9;6 +10.8;0.32;0.44;1.6;0.063;16;37;0.9985;3.22;0.78;10;6 +7.1;0.6;0;1.8;0.074;16;34;0.9972;3.47;0.7;9.9;6 +11.1;0.35;0.48;3.1;0.09;5;21;0.9986;3.17;0.53;10.5;5 +7.7;0.775;0.42;1.9;0.092;8;86;0.9959;3.23;0.59;9.5;5 +7.1;0.6;0;1.8;0.074;16;34;0.9972;3.47;0.7;9.9;6 +8;0.57;0.23;3.2;0.073;17;119;0.99675;3.26;0.57;9.3;5 +9.4;0.34;0.37;2.2;0.075;5;13;0.998;3.22;0.62;9.2;5 +6.6;0.695;0;2.1;0.075;12;56;0.9968;3.49;0.67;9.2;5 +7.7;0.41;0.76;1.8;0.611;8;45;0.9968;3.06;1.26;9.4;5 +10;0.31;0.47;2.6;0.085;14;33;0.99965;3.36;0.8;10.5;7 +7.9;0.33;0.23;1.7;0.077;18;45;0.99625;3.29;0.65;9.3;5 +7;0.975;0.04;2;0.087;12;67;0.99565;3.35;0.6;9.4;4 +8;0.52;0.03;1.7;0.07;10;35;0.99575;3.34;0.57;10;5 +7.9;0.37;0.23;1.8;0.077;23;49;0.9963;3.28;0.67;9.3;5 +12.5;0.56;0.49;2.4;0.064;5;27;0.9999;3.08;0.87;10.9;5 +11.8;0.26;0.52;1.8;0.071;6;10;0.9968;3.2;0.72;10.2;7 +8.1;0.87;0;3.3;0.096;26;61;1.00025;3.6;0.72;9.8;4 +7.9;0.35;0.46;3.6;0.078;15;37;0.9973;3.35;0.86;12.8;8 +6.9;0.54;0.04;3;0.077;7;27;0.9987;3.69;0.91;9.4;6 +11.5;0.18;0.51;4;0.104;4;23;0.9996;3.28;0.97;10.1;6 +7.9;0.545;0.06;4;0.087;27;61;0.9965;3.36;0.67;10.7;6 +11.5;0.18;0.51;4;0.104;4;23;0.9996;3.28;0.97;10.1;6 +10.9;0.37;0.58;4;0.071;17;65;0.99935;3.22;0.78;10.1;5 +8.4;0.715;0.2;2.4;0.076;10;38;0.99735;3.31;0.64;9.4;5 +7.5;0.65;0.18;7;0.088;27;94;0.99915;3.38;0.77;9.4;5 +7.9;0.545;0.06;4;0.087;27;61;0.9965;3.36;0.67;10.7;6 +6.9;0.54;0.04;3;0.077;7;27;0.9987;3.69;0.91;9.4;6 +11.5;0.18;0.51;4;0.104;4;23;0.9996;3.28;0.97;10.1;6 +10.3;0.32;0.45;6.4;0.073;5;13;0.9976;3.23;0.82;12.6;8 +8.9;0.4;0.32;5.6;0.087;10;47;0.9991;3.38;0.77;10.5;7 +11.4;0.26;0.44;3.6;0.071;6;19;0.9986;3.12;0.82;9.3;6 
+7.7;0.27;0.68;3.5;0.358;5;10;0.9972;3.25;1.08;9.9;7 +7.6;0.52;0.12;3;0.067;12;53;0.9971;3.36;0.57;9.1;5 +8.9;0.4;0.32;5.6;0.087;10;47;0.9991;3.38;0.77;10.5;7 +9.9;0.59;0.07;3.4;0.102;32;71;1.00015;3.31;0.71;9.8;5 +9.9;0.59;0.07;3.4;0.102;32;71;1.00015;3.31;0.71;9.8;5 +12;0.45;0.55;2;0.073;25;49;0.9997;3.1;0.76;10.3;6 +7.5;0.4;0.12;3;0.092;29;53;0.9967;3.37;0.7;10.3;6 +8.7;0.52;0.09;2.5;0.091;20;49;0.9976;3.34;0.86;10.6;7 +11.6;0.42;0.53;3.3;0.105;33;98;1.001;3.2;0.95;9.2;5 +8.7;0.52;0.09;2.5;0.091;20;49;0.9976;3.34;0.86;10.6;7 +11;0.2;0.48;2;0.343;6;18;0.9979;3.3;0.71;10.5;5 +10.4;0.55;0.23;2.7;0.091;18;48;0.9994;3.22;0.64;10.3;6 +6.9;0.36;0.25;2.4;0.098;5;16;0.9964;3.41;0.6;10.1;6 +13.3;0.34;0.52;3.2;0.094;17;53;1.0014;3.05;0.81;9.5;6 +10.8;0.5;0.46;2.5;0.073;5;27;1.0001;3.05;0.64;9.5;5 +10.6;0.83;0.37;2.6;0.086;26;70;0.9981;3.16;0.52;9.9;5 +7.1;0.63;0.06;2;0.083;8;29;0.99855;3.67;0.73;9.6;5 +7.2;0.65;0.02;2.3;0.094;5;31;0.9993;3.67;0.8;9.7;5 +6.9;0.67;0.06;2.1;0.08;8;33;0.99845;3.68;0.71;9.6;5 +7.5;0.53;0.06;2.6;0.086;20;44;0.9965;3.38;0.59;10.7;6 +11.1;0.18;0.48;1.5;0.068;7;15;0.9973;3.22;0.64;10.1;6 +8.3;0.705;0.12;2.6;0.092;12;28;0.9994;3.51;0.72;10;5 +7.4;0.67;0.12;1.6;0.186;5;21;0.996;3.39;0.54;9.5;5 +8.4;0.65;0.6;2.1;0.112;12;90;0.9973;3.2;0.52;9.2;5 +10.3;0.53;0.48;2.5;0.063;6;25;0.9998;3.12;0.59;9.3;6 +7.6;0.62;0.32;2.2;0.082;7;54;0.9966;3.36;0.52;9.4;5 +10.3;0.41;0.42;2.4;0.213;6;14;0.9994;3.19;0.62;9.5;6 +10.3;0.43;0.44;2.4;0.214;5;12;0.9994;3.19;0.63;9.5;6 +7.4;0.29;0.38;1.7;0.062;9;30;0.9968;3.41;0.53;9.5;6 +10.3;0.53;0.48;2.5;0.063;6;25;0.9998;3.12;0.59;9.3;6 +7.9;0.53;0.24;2;0.072;15;105;0.996;3.27;0.54;9.4;6 +9;0.46;0.31;2.8;0.093;19;98;0.99815;3.32;0.63;9.5;6 +8.6;0.47;0.3;3;0.076;30;135;0.9976;3.3;0.53;9.4;5 +7.4;0.36;0.29;2.6;0.087;26;72;0.99645;3.39;0.68;11;5 +7.1;0.35;0.29;2.5;0.096;20;53;0.9962;3.42;0.65;11;6 +9.6;0.56;0.23;3.4;0.102;37;92;0.9996;3.3;0.65;10.1;5 +9.6;0.77;0.12;2.9;0.082;30;74;0.99865;3.3;0.64;10.4;6 +9.8;0.66;0.39;3.2;0.083;21;59;0.9989;3.37;0.71;11.5;7 +9.6;0.77;0.12;2.9;0.082;30;74;0.99865;3.3;0.64;10.4;6 +9.8;0.66;0.39;3.2;0.083;21;59;0.9989;3.37;0.71;11.5;7 +9.3;0.61;0.26;3.4;0.09;25;87;0.99975;3.24;0.62;9.7;5 +7.8;0.62;0.05;2.3;0.079;6;18;0.99735;3.29;0.63;9.3;5 +10.3;0.59;0.42;2.8;0.09;35;73;0.999;3.28;0.7;9.5;6 +10;0.49;0.2;11;0.071;13;50;1.0015;3.16;0.69;9.2;6 +10;0.49;0.2;11;0.071;13;50;1.0015;3.16;0.69;9.2;6 +11.6;0.53;0.66;3.65;0.121;6;14;0.9978;3.05;0.74;11.5;7 +10.3;0.44;0.5;4.5;0.107;5;13;0.998;3.28;0.83;11.5;5 +13.4;0.27;0.62;2.6;0.082;6;21;1.0002;3.16;0.67;9.7;6 +10.7;0.46;0.39;2;0.061;7;15;0.9981;3.18;0.62;9.5;5 +10.2;0.36;0.64;2.9;0.122;10;41;0.998;3.23;0.66;12.5;6 +10.2;0.36;0.64;2.9;0.122;10;41;0.998;3.23;0.66;12.5;6 +8;0.58;0.28;3.2;0.066;21;114;0.9973;3.22;0.54;9.4;6 +8.4;0.56;0.08;2.1;0.105;16;44;0.9958;3.13;0.52;11;5 +7.9;0.65;0.01;2.5;0.078;17;38;0.9963;3.34;0.74;11.7;7 +11.9;0.695;0.53;3.4;0.128;7;21;0.9992;3.17;0.84;12.2;7 +8.9;0.43;0.45;1.9;0.052;6;16;0.9948;3.35;0.7;12.5;6 +7.8;0.43;0.32;2.8;0.08;29;58;0.9974;3.31;0.64;10.3;5 +12.4;0.49;0.58;3;0.103;28;99;1.0008;3.16;1;11.5;6 +12.5;0.28;0.54;2.3;0.082;12;29;0.9997;3.11;1.36;9.8;7 +12.2;0.34;0.5;2.4;0.066;10;21;1;3.12;1.18;9.2;6 +10.6;0.42;0.48;2.7;0.065;5;18;0.9972;3.21;0.87;11.3;6 +10.9;0.39;0.47;1.8;0.118;6;14;0.9982;3.3;0.75;9.8;6 +10.9;0.39;0.47;1.8;0.118;6;14;0.9982;3.3;0.75;9.8;6 +11.9;0.57;0.5;2.6;0.082;6;32;1.0006;3.12;0.78;10.7;6 +7;0.685;0;1.9;0.067;40;63;0.9979;3.6;0.81;9.9;5 +6.6;0.815;0.02;2.7;0.072;17;34;0.9955;3.58;0.89;12.3;7 
+13.8;0.49;0.67;3;0.093;6;15;0.9986;3.02;0.93;12;6 +9.6;0.56;0.31;2.8;0.089;15;46;0.9979;3.11;0.92;10;6 +9.1;0.785;0;2.6;0.093;11;28;0.9994;3.36;0.86;9.4;6 +10.7;0.67;0.22;2.7;0.107;17;34;1.0004;3.28;0.98;9.9;6 +9.1;0.795;0;2.6;0.096;11;26;0.9994;3.35;0.83;9.4;6 +7.7;0.665;0;2.4;0.09;8;19;0.9974;3.27;0.73;9.3;5 +13.5;0.53;0.79;4.8;0.12;23;77;1.0018;3.18;0.77;13;5 +6.1;0.21;0.4;1.4;0.066;40.5;165;0.9912;3.25;0.59;11.9;6 +6.7;0.75;0.01;2.4;0.078;17;32;0.9955;3.55;0.61;12.8;6 +11.5;0.41;0.52;3;0.08;29;55;1.0001;3.26;0.88;11;5 +10.5;0.42;0.66;2.95;0.116;12;29;0.997;3.24;0.75;11.7;7 +11.9;0.43;0.66;3.1;0.109;10;23;1;3.15;0.85;10.4;7 +12.6;0.38;0.66;2.6;0.088;10;41;1.001;3.17;0.68;9.8;6 +8.2;0.7;0.23;2;0.099;14;81;0.9973;3.19;0.7;9.4;5 +8.6;0.45;0.31;2.6;0.086;21;50;0.9982;3.37;0.91;9.9;6 +11.9;0.58;0.66;2.5;0.072;6;37;0.9992;3.05;0.56;10;5 +12.5;0.46;0.63;2;0.071;6;15;0.9988;2.99;0.87;10.2;5 +12.8;0.615;0.66;5.8;0.083;7;42;1.0022;3.07;0.73;10;7 +10;0.42;0.5;3.4;0.107;7;21;0.9979;3.26;0.93;11.8;6 +12.8;0.615;0.66;5.8;0.083;7;42;1.0022;3.07;0.73;10;7 +10.4;0.575;0.61;2.6;0.076;11;24;1;3.16;0.69;9;5 +10.3;0.34;0.52;2.8;0.159;15;75;0.9998;3.18;0.64;9.4;5 +9.4;0.27;0.53;2.4;0.074;6;18;0.9962;3.2;1.13;12;7 +6.9;0.765;0.02;2.3;0.063;35;63;0.9975;3.57;0.78;9.9;5 +7.9;0.24;0.4;1.6;0.056;11;25;0.9967;3.32;0.87;8.7;6 +9.1;0.28;0.48;1.8;0.067;26;46;0.9967;3.32;1.04;10.6;6 +7.4;0.55;0.22;2.2;0.106;12;72;0.9959;3.05;0.63;9.2;5 +14;0.41;0.63;3.8;0.089;6;47;1.0014;3.01;0.81;10.8;6 +11.5;0.54;0.71;4.4;0.124;6;15;0.9984;3.01;0.83;11.8;7 +11.5;0.45;0.5;3;0.078;19;47;1.0003;3.26;1.11;11;6 +9.4;0.27;0.53;2.4;0.074;6;18;0.9962;3.2;1.13;12;7 +11.4;0.625;0.66;6.2;0.088;6;24;0.9988;3.11;0.99;13.3;6 +8.3;0.42;0.38;2.5;0.094;24;60;0.9979;3.31;0.7;10.8;6 +8.3;0.26;0.42;2;0.08;11;27;0.9974;3.21;0.8;9.4;6 +13.7;0.415;0.68;2.9;0.085;17;43;1.0014;3.06;0.8;10;6 +8.3;0.26;0.42;2;0.08;11;27;0.9974;3.21;0.8;9.4;6 +8.3;0.26;0.42;2;0.08;11;27;0.9974;3.21;0.8;9.4;6 +7.7;0.51;0.28;2.1;0.087;23;54;0.998;3.42;0.74;9.2;5 +7.4;0.63;0.07;2.4;0.09;11;37;0.9979;3.43;0.76;9.7;6 +7.8;0.54;0.26;2;0.088;23;48;0.9981;3.41;0.74;9.2;6 +8.3;0.66;0.15;1.9;0.079;17;42;0.9972;3.31;0.54;9.6;6 +7.8;0.46;0.26;1.9;0.088;23;53;0.9981;3.43;0.74;9.2;6 +9.6;0.38;0.31;2.5;0.096;16;49;0.9982;3.19;0.7;10;7 +5.6;0.85;0.05;1.4;0.045;12;88;0.9924;3.56;0.82;12.9;8 +13.7;0.415;0.68;2.9;0.085;17;43;1.0014;3.06;0.8;10;6 +9.5;0.37;0.52;2;0.082;6;26;0.998;3.18;0.51;9.5;5 +8.4;0.665;0.61;2;0.112;13;95;0.997;3.16;0.54;9.1;5 +12.7;0.6;0.65;2.3;0.063;6;25;0.9997;3.03;0.57;9.9;5 +12;0.37;0.76;4.2;0.066;7;38;1.0004;3.22;0.6;13;7 +6.6;0.735;0.02;7.9;0.122;68;124;0.9994;3.47;0.53;9.9;5 +11.5;0.59;0.59;2.6;0.087;13;49;0.9988;3.18;0.65;11;6 +11.5;0.59;0.59;2.6;0.087;13;49;0.9988;3.18;0.65;11;6 +8.7;0.765;0.22;2.3;0.064;9;42;0.9963;3.1;0.55;9.4;5 +6.6;0.735;0.02;7.9;0.122;68;124;0.9994;3.47;0.53;9.9;5 +7.7;0.26;0.3;1.7;0.059;20;38;0.9949;3.29;0.47;10.8;6 +12.2;0.48;0.54;2.6;0.085;19;64;1;3.1;0.61;10.5;6 +11.4;0.6;0.49;2.7;0.085;10;41;0.9994;3.15;0.63;10.5;6 +7.7;0.69;0.05;2.7;0.075;15;27;0.9974;3.26;0.61;9.1;5 +8.7;0.31;0.46;1.4;0.059;11;25;0.9966;3.36;0.76;10.1;6 +9.8;0.44;0.47;2.5;0.063;9;28;0.9981;3.24;0.65;10.8;6 +12;0.39;0.66;3;0.093;12;30;0.9996;3.18;0.63;10.8;7 +10.4;0.34;0.58;3.7;0.174;6;16;0.997;3.19;0.7;11.3;6 +12.5;0.46;0.49;4.5;0.07;26;49;0.9981;3.05;0.57;9.6;4 +9;0.43;0.34;2.5;0.08;26;86;0.9987;3.38;0.62;9.5;6 +9.1;0.45;0.35;2.4;0.08;23;78;0.9987;3.38;0.62;9.5;5 +7.1;0.735;0.16;1.9;0.1;15;77;0.9966;3.27;0.64;9.3;5 +9.9;0.4;0.53;6.7;0.097;6;19;0.9986;3.27;0.82;11.7;7 
+8.8;0.52;0.34;2.7;0.087;24;122;0.9982;3.26;0.61;9.5;5 +8.6;0.725;0.24;6.6;0.117;31;134;1.0014;3.32;1.07;9.3;5 +10.6;0.48;0.64;2.2;0.111;6;20;0.997;3.26;0.66;11.7;6 +7;0.58;0.12;1.9;0.091;34;124;0.9956;3.44;0.48;10.5;5 +11.9;0.38;0.51;2;0.121;7;20;0.9996;3.24;0.76;10.4;6 +6.8;0.77;0;1.8;0.066;34;52;0.9976;3.62;0.68;9.9;5 +9.5;0.56;0.33;2.4;0.089;35;67;0.9972;3.28;0.73;11.8;7 +6.6;0.84;0.03;2.3;0.059;32;48;0.9952;3.52;0.56;12.3;7 +7.7;0.96;0.2;2;0.047;15;60;0.9955;3.36;0.44;10.9;5 +10.5;0.24;0.47;2.1;0.066;6;24;0.9978;3.15;0.9;11;7 +7.7;0.96;0.2;2;0.047;15;60;0.9955;3.36;0.44;10.9;5 +6.6;0.84;0.03;2.3;0.059;32;48;0.9952;3.52;0.56;12.3;7 +6.4;0.67;0.08;2.1;0.045;19;48;0.9949;3.49;0.49;11.4;6 +9.5;0.78;0.22;1.9;0.077;6;32;0.9988;3.26;0.56;10.6;6 +9.1;0.52;0.33;1.3;0.07;9;30;0.9978;3.24;0.6;9.3;5 +12.8;0.84;0.63;2.4;0.088;13;35;0.9997;3.1;0.6;10.4;6 +10.5;0.24;0.47;2.1;0.066;6;24;0.9978;3.15;0.9;11;7 +7.8;0.55;0.35;2.2;0.074;21;66;0.9974;3.25;0.56;9.2;5 +11.9;0.37;0.69;2.3;0.078;12;24;0.9958;3;0.65;12.8;6 +12.3;0.39;0.63;2.3;0.091;6;18;1.0004;3.16;0.49;9.5;5 +10.4;0.41;0.55;3.2;0.076;22;54;0.9996;3.15;0.89;9.9;6 +12.3;0.39;0.63;2.3;0.091;6;18;1.0004;3.16;0.49;9.5;5 +8;0.67;0.3;2;0.06;38;62;0.9958;3.26;0.56;10.2;6 +11.1;0.45;0.73;3.2;0.066;6;22;0.9986;3.17;0.66;11.2;6 +10.4;0.41;0.55;3.2;0.076;22;54;0.9996;3.15;0.89;9.9;6 +7;0.62;0.18;1.5;0.062;7;50;0.9951;3.08;0.6;9.3;5 +12.6;0.31;0.72;2.2;0.072;6;29;0.9987;2.88;0.82;9.8;8 +11.9;0.4;0.65;2.15;0.068;7;27;0.9988;3.06;0.68;11.3;6 +15.6;0.685;0.76;3.7;0.1;6;43;1.0032;2.95;0.68;11.2;7 +10;0.44;0.49;2.7;0.077;11;19;0.9963;3.23;0.63;11.6;7 +5.3;0.57;0.01;1.7;0.054;5;27;0.9934;3.57;0.84;12.5;7 +9.5;0.735;0.1;2.1;0.079;6;31;0.9986;3.23;0.56;10.1;6 +12.5;0.38;0.6;2.6;0.081;31;72;0.9996;3.1;0.73;10.5;5 +9.3;0.48;0.29;2.1;0.127;6;16;0.9968;3.22;0.72;11.2;5 +8.6;0.53;0.22;2;0.1;7;27;0.9967;3.2;0.56;10.2;6 +11.9;0.39;0.69;2.8;0.095;17;35;0.9994;3.1;0.61;10.8;6 +11.9;0.39;0.69;2.8;0.095;17;35;0.9994;3.1;0.61;10.8;6 +8.4;0.37;0.53;1.8;0.413;9;26;0.9979;3.06;1.06;9.1;6 +6.8;0.56;0.03;1.7;0.084;18;35;0.9968;3.44;0.63;10;6 +10.4;0.33;0.63;2.8;0.084;5;22;0.9998;3.26;0.74;11.2;7 +7;0.23;0.4;1.6;0.063;21;67;0.9952;3.5;0.63;11.1;5 +11.3;0.62;0.67;5.2;0.086;6;19;0.9988;3.22;0.69;13.4;8 +8.9;0.59;0.39;2.3;0.095;5;22;0.9986;3.37;0.58;10.3;5 +9.2;0.63;0.21;2.7;0.097;29;65;0.9988;3.28;0.58;9.6;5 +10.4;0.33;0.63;2.8;0.084;5;22;0.9998;3.26;0.74;11.2;7 +11.6;0.58;0.66;2.2;0.074;10;47;1.0008;3.25;0.57;9;3 +9.2;0.43;0.52;2.3;0.083;14;23;0.9976;3.35;0.61;11.3;6 +8.3;0.615;0.22;2.6;0.087;6;19;0.9982;3.26;0.61;9.3;5 +11;0.26;0.68;2.55;0.085;10;25;0.997;3.18;0.61;11.8;5 +8.1;0.66;0.7;2.2;0.098;25;129;0.9972;3.08;0.53;9;5 +11.5;0.315;0.54;2.1;0.084;5;15;0.9987;2.98;0.7;9.2;6 +10;0.29;0.4;2.9;0.098;10;26;1.0006;3.48;0.91;9.7;5 +10.3;0.5;0.42;2;0.069;21;51;0.9982;3.16;0.72;11.5;6 +8.8;0.46;0.45;2.6;0.065;7;18;0.9947;3.32;0.79;14;6 +11.4;0.36;0.69;2.1;0.09;6;21;1;3.17;0.62;9.2;6 +8.7;0.82;0.02;1.2;0.07;36;48;0.9952;3.2;0.58;9.8;5 +13;0.32;0.65;2.6;0.093;15;47;0.9996;3.05;0.61;10.6;5 +9.6;0.54;0.42;2.4;0.081;25;52;0.997;3.2;0.71;11.4;6 +12.5;0.37;0.55;2.6;0.083;25;68;0.9995;3.15;0.82;10.4;6 +9.9;0.35;0.55;2.1;0.062;5;14;0.9971;3.26;0.79;10.6;5 +10.5;0.28;0.51;1.7;0.08;10;24;0.9982;3.2;0.89;9.4;6 +9.6;0.68;0.24;2.2;0.087;5;28;0.9988;3.14;0.6;10.2;5 +9.3;0.27;0.41;2;0.091;6;16;0.998;3.28;0.7;9.7;5 +10.4;0.24;0.49;1.8;0.075;6;20;0.9977;3.18;1.06;11;6 +9.6;0.68;0.24;2.2;0.087;5;28;0.9988;3.14;0.6;10.2;5 +9.4;0.685;0.11;2.7;0.077;6;31;0.9984;3.19;0.7;10.1;6 
+10.6;0.28;0.39;15.5;0.069;6;23;1.0026;3.12;0.66;9.2;5 +9.4;0.3;0.56;2.8;0.08;6;17;0.9964;3.15;0.92;11.7;8 +10.6;0.36;0.59;2.2;0.152;6;18;0.9986;3.04;1.05;9.4;5 +10.6;0.36;0.6;2.2;0.152;7;18;0.9986;3.04;1.06;9.4;5 +10.6;0.44;0.68;4.1;0.114;6;24;0.997;3.06;0.66;13.4;6 +10.2;0.67;0.39;1.9;0.054;6;17;0.9976;3.17;0.47;10;5 +10.2;0.67;0.39;1.9;0.054;6;17;0.9976;3.17;0.47;10;5 +10.2;0.645;0.36;1.8;0.053;5;14;0.9982;3.17;0.42;10;6 +11.6;0.32;0.55;2.8;0.081;35;67;1.0002;3.32;0.92;10.8;7 +9.3;0.39;0.4;2.6;0.073;10;26;0.9984;3.34;0.75;10.2;6 +9.3;0.775;0.27;2.8;0.078;24;56;0.9984;3.31;0.67;10.6;6 +9.2;0.41;0.5;2.5;0.055;12;25;0.9952;3.34;0.79;13.3;7 +8.9;0.4;0.51;2.6;0.052;13;27;0.995;3.32;0.9;13.4;7 +8.7;0.69;0.31;3;0.086;23;81;1.0002;3.48;0.74;11.6;6 +6.5;0.39;0.23;8.3;0.051;28;91;0.9952;3.44;0.55;12.1;6 +10.7;0.35;0.53;2.6;0.07;5;16;0.9972;3.15;0.65;11;8 +7.8;0.52;0.25;1.9;0.081;14;38;0.9984;3.43;0.65;9;6 +7.2;0.34;0.32;2.5;0.09;43;113;0.9966;3.32;0.79;11.1;5 +10.7;0.35;0.53;2.6;0.07;5;16;0.9972;3.15;0.65;11;8 +8.7;0.69;0.31;3;0.086;23;81;1.0002;3.48;0.74;11.6;6 +7.8;0.52;0.25;1.9;0.081;14;38;0.9984;3.43;0.65;9;6 +10.4;0.44;0.73;6.55;0.074;38;76;0.999;3.17;0.85;12;7 +10.4;0.44;0.73;6.55;0.074;38;76;0.999;3.17;0.85;12;7 +10.5;0.26;0.47;1.9;0.078;6;24;0.9976;3.18;1.04;10.9;7 +10.5;0.24;0.42;1.8;0.077;6;22;0.9976;3.21;1.05;10.8;7 +10.2;0.49;0.63;2.9;0.072;10;26;0.9968;3.16;0.78;12.5;7 +10.4;0.24;0.46;1.8;0.075;6;21;0.9976;3.25;1.02;10.8;7 +11.2;0.67;0.55;2.3;0.084;6;13;1;3.17;0.71;9.5;6 +10;0.59;0.31;2.2;0.09;26;62;0.9994;3.18;0.63;10.2;6 +13.3;0.29;0.75;2.8;0.084;23;43;0.9986;3.04;0.68;11.4;7 +12.4;0.42;0.49;4.6;0.073;19;43;0.9978;3.02;0.61;9.5;5 +10;0.59;0.31;2.2;0.09;26;62;0.9994;3.18;0.63;10.2;6 +10.7;0.4;0.48;2.1;0.125;15;49;0.998;3.03;0.81;9.7;6 +10.5;0.51;0.64;2.4;0.107;6;15;0.9973;3.09;0.66;11.8;7 +10.5;0.51;0.64;2.4;0.107;6;15;0.9973;3.09;0.66;11.8;7 +8.5;0.655;0.49;6.1;0.122;34;151;1.001;3.31;1.14;9.3;5 +12.5;0.6;0.49;4.3;0.1;5;14;1.001;3.25;0.74;11.9;6 +10.4;0.61;0.49;2.1;0.2;5;16;0.9994;3.16;0.63;8.4;3 +10.9;0.21;0.49;2.8;0.088;11;32;0.9972;3.22;0.68;11.7;6 +7.3;0.365;0.49;2.5;0.088;39;106;0.9966;3.36;0.78;11;5 +9.8;0.25;0.49;2.7;0.088;15;33;0.9982;3.42;0.9;10;6 +7.6;0.41;0.49;2;0.088;16;43;0.998;3.48;0.64;9.1;5 +8.2;0.39;0.49;2.3;0.099;47;133;0.9979;3.38;0.99;9.8;5 +9.3;0.4;0.49;2.5;0.085;38;142;0.9978;3.22;0.55;9.4;5 +9.2;0.43;0.49;2.4;0.086;23;116;0.9976;3.23;0.64;9.5;5 +10.4;0.64;0.24;2.8;0.105;29;53;0.9998;3.24;0.67;9.9;5 +7.3;0.365;0.49;2.5;0.088;39;106;0.9966;3.36;0.78;11;5 +7;0.38;0.49;2.5;0.097;33;85;0.9962;3.39;0.77;11.4;6 +8.2;0.42;0.49;2.6;0.084;32;55;0.9988;3.34;0.75;8.7;6 +9.9;0.63;0.24;2.4;0.077;6;33;0.9974;3.09;0.57;9.4;5 +9.1;0.22;0.24;2.1;0.078;1;28;0.999;3.41;0.87;10.3;6 +11.9;0.38;0.49;2.7;0.098;12;42;1.0004;3.16;0.61;10.3;5 +11.9;0.38;0.49;2.7;0.098;12;42;1.0004;3.16;0.61;10.3;5 +10.3;0.27;0.24;2.1;0.072;15;33;0.9956;3.22;0.66;12.8;6 +10;0.48;0.24;2.7;0.102;13;32;1;3.28;0.56;10;6 +9.1;0.22;0.24;2.1;0.078;1;28;0.999;3.41;0.87;10.3;6 +9.9;0.63;0.24;2.4;0.077;6;33;0.9974;3.09;0.57;9.4;5 +8.1;0.825;0.24;2.1;0.084;5;13;0.9972;3.37;0.77;10.7;6 +12.9;0.35;0.49;5.8;0.066;5;35;1.0014;3.2;0.66;12;7 +11.2;0.5;0.74;5.15;0.1;5;17;0.9996;3.22;0.62;11.2;5 +9.2;0.59;0.24;3.3;0.101;20;47;0.9988;3.26;0.67;9.6;5 +9.5;0.46;0.49;6.3;0.064;5;17;0.9988;3.21;0.73;11;6 +9.3;0.715;0.24;2.1;0.07;5;20;0.9966;3.12;0.59;9.9;5 +11.2;0.66;0.24;2.5;0.085;16;53;0.9993;3.06;0.72;11;6 +14.3;0.31;0.74;1.8;0.075;6;15;1.0008;2.86;0.79;8.4;6 +9.1;0.47;0.49;2.6;0.094;38;106;0.9982;3.08;0.59;9.1;5 
+7.5;0.55;0.24;2;0.078;10;28;0.9983;3.45;0.78;9.5;6 +10.6;0.31;0.49;2.5;0.067;6;21;0.9987;3.26;0.86;10.7;6 +12.4;0.35;0.49;2.6;0.079;27;69;0.9994;3.12;0.75;10.4;6 +9;0.53;0.49;1.9;0.171;6;25;0.9975;3.27;0.61;9.4;6 +6.8;0.51;0.01;2.1;0.074;9;25;0.9958;3.33;0.56;9.5;6 +9.4;0.43;0.24;2.8;0.092;14;45;0.998;3.19;0.73;10;6 +9.5;0.46;0.24;2.7;0.092;14;44;0.998;3.12;0.74;10;6 +5;1.04;0.24;1.6;0.05;32;96;0.9934;3.74;0.62;11.5;5 +15.5;0.645;0.49;4.2;0.095;10;23;1.00315;2.92;0.74;11.1;5 +15.5;0.645;0.49;4.2;0.095;10;23;1.00315;2.92;0.74;11.1;5 +10.9;0.53;0.49;4.6;0.118;10;17;1.0002;3.07;0.56;11.7;6 +15.6;0.645;0.49;4.2;0.095;10;23;1.00315;2.92;0.74;11.1;5 +10.9;0.53;0.49;4.6;0.118;10;17;1.0002;3.07;0.56;11.7;6 +13;0.47;0.49;4.3;0.085;6;47;1.0021;3.3;0.68;12.7;6 +12.7;0.6;0.49;2.8;0.075;5;19;0.9994;3.14;0.57;11.4;5 +9;0.44;0.49;2.4;0.078;26;121;0.9978;3.23;0.58;9.2;5 +9;0.54;0.49;2.9;0.094;41;110;0.9982;3.08;0.61;9.2;5 +7.6;0.29;0.49;2.7;0.092;25;60;0.9971;3.31;0.61;10.1;6 +13;0.47;0.49;4.3;0.085;6;47;1.0021;3.3;0.68;12.7;6 +12.7;0.6;0.49;2.8;0.075;5;19;0.9994;3.14;0.57;11.4;5 +8.7;0.7;0.24;2.5;0.226;5;15;0.9991;3.32;0.6;9;6 +8.7;0.7;0.24;2.5;0.226;5;15;0.9991;3.32;0.6;9;6 +9.8;0.5;0.49;2.6;0.25;5;20;0.999;3.31;0.79;10.7;6 +6.2;0.36;0.24;2.2;0.095;19;42;0.9946;3.57;0.57;11.7;6 +11.5;0.35;0.49;3.3;0.07;10;37;1.0003;3.32;0.91;11;6 +6.2;0.36;0.24;2.2;0.095;19;42;0.9946;3.57;0.57;11.7;6 +10.2;0.24;0.49;2.4;0.075;10;28;0.9978;3.14;0.61;10.4;5 +10.5;0.59;0.49;2.1;0.07;14;47;0.9991;3.3;0.56;9.6;4 +10.6;0.34;0.49;3.2;0.078;20;78;0.9992;3.19;0.7;10;6 +12.3;0.27;0.49;3.1;0.079;28;46;0.9993;3.2;0.8;10.2;6 +9.9;0.5;0.24;2.3;0.103;6;14;0.9978;3.34;0.52;10;4 +8.8;0.44;0.49;2.8;0.083;18;111;0.9982;3.3;0.6;9.5;5 +8.8;0.47;0.49;2.9;0.085;17;110;0.9982;3.29;0.6;9.8;5 +10.6;0.31;0.49;2.2;0.063;18;40;0.9976;3.14;0.51;9.8;6 +12.3;0.5;0.49;2.2;0.089;5;14;1.0002;3.19;0.44;9.6;5 +12.3;0.5;0.49;2.2;0.089;5;14;1.0002;3.19;0.44;9.6;5 +11.7;0.49;0.49;2.2;0.083;5;15;1;3.19;0.43;9.2;5 +12;0.28;0.49;1.9;0.074;10;21;0.9976;2.98;0.66;9.9;7 +11.8;0.33;0.49;3.4;0.093;54;80;1.0002;3.3;0.76;10.7;7 +7.6;0.51;0.24;2.4;0.091;8;38;0.998;3.47;0.66;9.6;6 +11.1;0.31;0.49;2.7;0.094;16;47;0.9986;3.12;1.02;10.6;7 +7.3;0.73;0.24;1.9;0.108;18;102;0.9967;3.26;0.59;9.3;5 +5;0.42;0.24;2;0.06;19;50;0.9917;3.72;0.74;14;8 +10.2;0.29;0.49;2.6;0.059;5;13;0.9976;3.05;0.74;10.5;7 +9;0.45;0.49;2.6;0.084;21;75;0.9987;3.35;0.57;9.7;5 +6.6;0.39;0.49;1.7;0.07;23;149;0.9922;3.12;0.5;11.5;6 +9;0.45;0.49;2.6;0.084;21;75;0.9987;3.35;0.57;9.7;5 +9.9;0.49;0.58;3.5;0.094;9;43;1.0004;3.29;0.58;9;5 +7.9;0.72;0.17;2.6;0.096;20;38;0.9978;3.4;0.53;9.5;5 +8.9;0.595;0.41;7.9;0.086;30;109;0.9998;3.27;0.57;9.3;5 +12.4;0.4;0.51;2;0.059;6;24;0.9994;3.04;0.6;9.3;6 +11.9;0.58;0.58;1.9;0.071;5;18;0.998;3.09;0.63;10;6 +8.5;0.585;0.18;2.1;0.078;5;30;0.9967;3.2;0.48;9.8;6 +12.7;0.59;0.45;2.3;0.082;11;22;1;3;0.7;9.3;6 +8.2;0.915;0.27;2.1;0.088;7;23;0.9962;3.26;0.47;10;4 +13.2;0.46;0.52;2.2;0.071;12;35;1.0006;3.1;0.56;9;6 +7.7;0.835;0;2.6;0.081;6;14;0.9975;3.3;0.52;9.3;5 +13.2;0.46;0.52;2.2;0.071;12;35;1.0006;3.1;0.56;9;6 +8.3;0.58;0.13;2.9;0.096;14;63;0.9984;3.17;0.62;9.1;6 +8.3;0.6;0.13;2.6;0.085;6;24;0.9984;3.31;0.59;9.2;6 +9.4;0.41;0.48;4.6;0.072;10;20;0.9973;3.34;0.79;12.2;7 +8.8;0.48;0.41;3.3;0.092;26;52;0.9982;3.31;0.53;10.5;6 +10.1;0.65;0.37;5.1;0.11;11;65;1.0026;3.32;0.64;10.4;6 +6.3;0.36;0.19;3.2;0.075;15;39;0.9956;3.56;0.52;12.7;6 +8.8;0.24;0.54;2.5;0.083;25;57;0.9983;3.39;0.54;9.2;5 +13.2;0.38;0.55;2.7;0.081;5;16;1.0006;2.98;0.54;9.4;5 
+7.5;0.64;0;2.4;0.077;18;29;0.9965;3.32;0.6;10;6 +8.2;0.39;0.38;1.5;0.058;10;29;0.9962;3.26;0.74;9.8;5 +9.2;0.755;0.18;2.2;0.148;10;103;0.9969;2.87;1.36;10.2;6 +9.6;0.6;0.5;2.3;0.079;28;71;0.9997;3.5;0.57;9.7;5 +9.6;0.6;0.5;2.3;0.079;28;71;0.9997;3.5;0.57;9.7;5 +11.5;0.31;0.51;2.2;0.079;14;28;0.9982;3.03;0.93;9.8;6 +11.4;0.46;0.5;2.7;0.122;4;17;1.0006;3.13;0.7;10.2;5 +11.3;0.37;0.41;2.3;0.088;6;16;0.9988;3.09;0.8;9.3;5 +8.3;0.54;0.24;3.4;0.076;16;112;0.9976;3.27;0.61;9.4;5 +8.2;0.56;0.23;3.4;0.078;14;104;0.9976;3.28;0.62;9.4;5 +10;0.58;0.22;1.9;0.08;9;32;0.9974;3.13;0.55;9.5;5 +7.9;0.51;0.25;2.9;0.077;21;45;0.9974;3.49;0.96;12.1;6 +6.8;0.69;0;5.6;0.124;21;58;0.9997;3.46;0.72;10.2;5 +6.8;0.69;0;5.6;0.124;21;58;0.9997;3.46;0.72;10.2;5 +8.8;0.6;0.29;2.2;0.098;5;15;0.9988;3.36;0.49;9.1;5 +8.8;0.6;0.29;2.2;0.098;5;15;0.9988;3.36;0.49;9.1;5 +8.7;0.54;0.26;2.5;0.097;7;31;0.9976;3.27;0.6;9.3;6 +7.6;0.685;0.23;2.3;0.111;20;84;0.9964;3.21;0.61;9.3;5 +8.7;0.54;0.26;2.5;0.097;7;31;0.9976;3.27;0.6;9.3;6 +10.4;0.28;0.54;2.7;0.105;5;19;0.9988;3.25;0.63;9.5;5 +7.6;0.41;0.14;3;0.087;21;43;0.9964;3.32;0.57;10.5;6 +10.1;0.935;0.22;3.4;0.105;11;86;1.001;3.43;0.64;11.3;4 +7.9;0.35;0.21;1.9;0.073;46;102;0.9964;3.27;0.58;9.5;5 +8.7;0.84;0;1.4;0.065;24;33;0.9954;3.27;0.55;9.7;5 +9.6;0.88;0.28;2.4;0.086;30;147;0.9979;3.24;0.53;9.4;5 +9.5;0.885;0.27;2.3;0.084;31;145;0.9978;3.24;0.53;9.4;5 +7.7;0.915;0.12;2.2;0.143;7;23;0.9964;3.35;0.65;10.2;7 +8.9;0.29;0.35;1.9;0.067;25;57;0.997;3.18;1.36;10.3;6 +9.9;0.54;0.45;2.3;0.071;16;40;0.9991;3.39;0.62;9.4;5 +9.5;0.59;0.44;2.3;0.071;21;68;0.9992;3.46;0.63;9.5;5 +9.9;0.54;0.45;2.3;0.071;16;40;0.9991;3.39;0.62;9.4;5 +9.5;0.59;0.44;2.3;0.071;21;68;0.9992;3.46;0.63;9.5;5 +9.9;0.54;0.45;2.3;0.071;16;40;0.9991;3.39;0.62;9.4;5 +7.8;0.64;0.1;6;0.115;5;11;0.9984;3.37;0.69;10.1;7 +7.3;0.67;0.05;3.6;0.107;6;20;0.9972;3.4;0.63;10.1;5 +8.3;0.845;0.01;2.2;0.07;5;14;0.9967;3.32;0.58;11;4 +8.7;0.48;0.3;2.8;0.066;10;28;0.9964;3.33;0.67;11.2;7 +6.7;0.42;0.27;8.6;0.068;24;148;0.9948;3.16;0.57;11.3;6 +10.7;0.43;0.39;2.2;0.106;8;32;0.9986;2.89;0.5;9.6;5 +9.8;0.88;0.25;2.5;0.104;35;155;1.001;3.41;0.67;11.2;5 +15.9;0.36;0.65;7.5;0.096;22;71;0.9976;2.98;0.84;14.9;5 +9.4;0.33;0.59;2.8;0.079;9;30;0.9976;3.12;0.54;12;6 +8.6;0.47;0.47;2.4;0.074;7;29;0.9979;3.08;0.46;9.5;5 +9.7;0.55;0.17;2.9;0.087;20;53;1.0004;3.14;0.61;9.4;5 +10.7;0.43;0.39;2.2;0.106;8;32;0.9986;2.89;0.5;9.6;5 +12;0.5;0.59;1.4;0.073;23;42;0.998;2.92;0.68;10.5;7 +7.2;0.52;0.07;1.4;0.074;5;20;0.9973;3.32;0.81;9.6;6 +7.1;0.84;0.02;4.4;0.096;5;13;0.997;3.41;0.57;11;4 +7.2;0.52;0.07;1.4;0.074;5;20;0.9973;3.32;0.81;9.6;6 +7.5;0.42;0.31;1.6;0.08;15;42;0.9978;3.31;0.64;9;5 +7.2;0.57;0.06;1.6;0.076;9;27;0.9972;3.36;0.7;9.6;6 +10.1;0.28;0.46;1.8;0.05;5;13;0.9974;3.04;0.79;10.2;6 +12.1;0.4;0.52;2;0.092;15;54;1;3.03;0.66;10.2;5 +9.4;0.59;0.14;2;0.084;25;48;0.9981;3.14;0.56;9.7;5 +8.3;0.49;0.36;1.8;0.222;6;16;0.998;3.18;0.6;9.5;6 +11.3;0.34;0.45;2;0.082;6;15;0.9988;2.94;0.66;9.2;6 +10;0.73;0.43;2.3;0.059;15;31;0.9966;3.15;0.57;11;5 +11.3;0.34;0.45;2;0.082;6;15;0.9988;2.94;0.66;9.2;6 +6.9;0.4;0.24;2.5;0.083;30;45;0.9959;3.26;0.58;10;5 +8.2;0.73;0.21;1.7;0.074;5;13;0.9968;3.2;0.52;9.5;5 +9.8;1.24;0.34;2;0.079;32;151;0.998;3.15;0.53;9.5;5 +8.2;0.73;0.21;1.7;0.074;5;13;0.9968;3.2;0.52;9.5;5 +10.8;0.4;0.41;2.2;0.084;7;17;0.9984;3.08;0.67;9.3;6 +9.3;0.41;0.39;2.2;0.064;12;31;0.9984;3.26;0.65;10.2;5 +10.8;0.4;0.41;2.2;0.084;7;17;0.9984;3.08;0.67;9.3;6 +8.6;0.8;0.11;2.3;0.084;12;31;0.9979;3.4;0.48;9.9;5 +8.3;0.78;0.1;2.6;0.081;45;87;0.9983;3.48;0.53;10;5 
+10.8;0.26;0.45;3.3;0.06;20;49;0.9972;3.13;0.54;9.6;5 +13.3;0.43;0.58;1.9;0.07;15;40;1.0004;3.06;0.49;9;5 +8;0.45;0.23;2.2;0.094;16;29;0.9962;3.21;0.49;10.2;6 +8.5;0.46;0.31;2.25;0.078;32;58;0.998;3.33;0.54;9.8;5 +8.1;0.78;0.23;2.6;0.059;5;15;0.997;3.37;0.56;11.3;5 +9.8;0.98;0.32;2.3;0.078;35;152;0.998;3.25;0.48;9.4;5 +8.1;0.78;0.23;2.6;0.059;5;15;0.997;3.37;0.56;11.3;5 +7.1;0.65;0.18;1.8;0.07;13;40;0.997;3.44;0.6;9.1;5 +9.1;0.64;0.23;3.1;0.095;13;38;0.9998;3.28;0.59;9.7;5 +7.7;0.66;0.04;1.6;0.039;4;9;0.9962;3.4;0.47;9.4;5 +8.1;0.38;0.48;1.8;0.157;5;17;0.9976;3.3;1.05;9.4;5 +7.4;1.185;0;4.25;0.097;5;14;0.9966;3.63;0.54;10.7;3 +9.2;0.92;0.24;2.6;0.087;12;93;0.9998;3.48;0.54;9.8;5 +8.6;0.49;0.51;2;0.422;16;62;0.9979;3.03;1.17;9;5 +9;0.48;0.32;2.8;0.084;21;122;0.9984;3.32;0.62;9.4;5 +9;0.47;0.31;2.7;0.084;24;125;0.9984;3.31;0.61;9.4;5 +5.1;0.47;0.02;1.3;0.034;18;44;0.9921;3.9;0.62;12.8;6 +7;0.65;0.02;2.1;0.066;8;25;0.9972;3.47;0.67;9.5;6 +7;0.65;0.02;2.1;0.066;8;25;0.9972;3.47;0.67;9.5;6 +9.4;0.615;0.28;3.2;0.087;18;72;1.0001;3.31;0.53;9.7;5 +11.8;0.38;0.55;2.1;0.071;5;19;0.9986;3.11;0.62;10.8;6 +10.6;1.02;0.43;2.9;0.076;26;88;0.9984;3.08;0.57;10.1;6 +7;0.65;0.02;2.1;0.066;8;25;0.9972;3.47;0.67;9.5;6 +7;0.64;0.02;2.1;0.067;9;23;0.997;3.47;0.67;9.4;6 +7.5;0.38;0.48;2.6;0.073;22;84;0.9972;3.32;0.7;9.6;4 +9.1;0.765;0.04;1.6;0.078;4;14;0.998;3.29;0.54;9.7;4 +8.4;1.035;0.15;6;0.073;11;54;0.999;3.37;0.49;9.9;5 +7;0.78;0.08;2;0.093;10;19;0.9956;3.4;0.47;10;5 +7.4;0.49;0.19;3;0.077;16;37;0.9966;3.37;0.51;10.5;5 +7.8;0.545;0.12;2.5;0.068;11;35;0.996;3.34;0.61;11.6;6 +9.7;0.31;0.47;1.6;0.062;13;33;0.9983;3.27;0.66;10;6 +10.6;1.025;0.43;2.8;0.08;21;84;0.9985;3.06;0.57;10.1;5 +8.9;0.565;0.34;3;0.093;16;112;0.9998;3.38;0.61;9.5;5 +8.7;0.69;0;3.2;0.084;13;33;0.9992;3.36;0.45;9.4;5 +8;0.43;0.36;2.3;0.075;10;48;0.9976;3.34;0.46;9.4;5 +9.9;0.74;0.28;2.6;0.078;21;77;0.998;3.28;0.51;9.8;5 +7.2;0.49;0.18;2.7;0.069;13;34;0.9967;3.29;0.48;9.2;6 +8;0.43;0.36;2.3;0.075;10;48;0.9976;3.34;0.46;9.4;5 +7.6;0.46;0.11;2.6;0.079;12;49;0.9968;3.21;0.57;10;5 +8.4;0.56;0.04;2;0.082;10;22;0.9976;3.22;0.44;9.6;5 +7.1;0.66;0;3.9;0.086;17;45;0.9976;3.46;0.54;9.5;5 +8.4;0.56;0.04;2;0.082;10;22;0.9976;3.22;0.44;9.6;5 +8.9;0.48;0.24;2.85;0.094;35;106;0.9982;3.1;0.53;9.2;5 +7.6;0.42;0.08;2.7;0.084;15;48;0.9968;3.21;0.59;10;5 +7.1;0.31;0.3;2.2;0.053;36;127;0.9965;2.94;1.62;9.5;5 +7.5;1.115;0.1;3.1;0.086;5;12;0.9958;3.54;0.6;11.2;4 +9;0.66;0.17;3;0.077;5;13;0.9976;3.29;0.55;10.4;5 +8.1;0.72;0.09;2.8;0.084;18;49;0.9994;3.43;0.72;11.1;6 +6.4;0.57;0.02;1.8;0.067;4;11;0.997;3.46;0.68;9.5;5 +6.4;0.57;0.02;1.8;0.067;4;11;0.997;3.46;0.68;9.5;5 +6.4;0.865;0.03;3.2;0.071;27;58;0.995;3.61;0.49;12.7;6 +9.5;0.55;0.66;2.3;0.387;12;37;0.9982;3.17;0.67;9.6;5 +8.9;0.875;0.13;3.45;0.088;4;14;0.9994;3.44;0.52;11.5;5 +7.3;0.835;0.03;2.1;0.092;10;19;0.9966;3.39;0.47;9.6;5 +7;0.45;0.34;2.7;0.082;16;72;0.998;3.55;0.6;9.5;5 +7.7;0.56;0.2;2;0.075;9;39;0.9987;3.48;0.62;9.3;5 +7.7;0.965;0.1;2.1;0.112;11;22;0.9963;3.26;0.5;9.5;5 +7.7;0.965;0.1;2.1;0.112;11;22;0.9963;3.26;0.5;9.5;5 +8.2;0.59;0;2.5;0.093;19;58;1.0002;3.5;0.65;9.3;6 +9;0.46;0.23;2.8;0.092;28;104;0.9983;3.1;0.56;9.2;5 +9;0.69;0;2.4;0.088;19;38;0.999;3.35;0.6;9.3;5 +8.3;0.76;0.29;4.2;0.075;12;16;0.9965;3.45;0.68;11.5;6 +9.2;0.53;0.24;2.6;0.078;28;139;0.99788;3.21;0.57;9.5;5 +6.5;0.615;0;1.9;0.065;9;18;0.9972;3.46;0.65;9.2;5 +11.6;0.41;0.58;2.8;0.096;25;101;1.00024;3.13;0.53;10;5 +11.1;0.39;0.54;2.7;0.095;21;101;1.0001;3.13;0.51;9.5;5 +7.3;0.51;0.18;2.1;0.07;12;28;0.99768;3.52;0.73;9.5;6 
+8.2;0.34;0.38;2.5;0.08;12;57;0.9978;3.3;0.47;9;6 +8.6;0.33;0.4;2.6;0.083;16;68;0.99782;3.3;0.48;9.4;5 +7.2;0.5;0.18;2.1;0.071;12;31;0.99761;3.52;0.72;9.6;6 +7.3;0.51;0.18;2.1;0.07;12;28;0.99768;3.52;0.73;9.5;6 +8.3;0.65;0.1;2.9;0.089;17;40;0.99803;3.29;0.55;9.5;5 +8.3;0.65;0.1;2.9;0.089;17;40;0.99803;3.29;0.55;9.5;5 +7.6;0.54;0.13;2.5;0.097;24;66;0.99785;3.39;0.61;9.4;5 +8.3;0.65;0.1;2.9;0.089;17;40;0.99803;3.29;0.55;9.5;5 +7.8;0.48;0.68;1.7;0.415;14;32;0.99656;3.09;1.06;9.1;6 +7.8;0.91;0.07;1.9;0.058;22;47;0.99525;3.51;0.43;10.7;6 +6.3;0.98;0.01;2;0.057;15;33;0.99488;3.6;0.46;11.2;6 +8.1;0.87;0;2.2;0.084;10;31;0.99656;3.25;0.5;9.8;5 +8.1;0.87;0;2.2;0.084;10;31;0.99656;3.25;0.5;9.8;5 +8.8;0.42;0.21;2.5;0.092;33;88;0.99823;3.19;0.52;9.2;5 +9;0.58;0.25;2.8;0.075;9;104;0.99779;3.23;0.57;9.7;5 +9.3;0.655;0.26;2;0.096;5;35;0.99738;3.25;0.42;9.6;5 +8.8;0.7;0;1.7;0.069;8;19;0.99701;3.31;0.53;10;6 +9.3;0.655;0.26;2;0.096;5;35;0.99738;3.25;0.42;9.6;5 +9.1;0.68;0.11;2.8;0.093;11;44;0.99888;3.31;0.55;9.5;6 +9.2;0.67;0.1;3;0.091;12;48;0.99888;3.31;0.54;9.5;6 +8.8;0.59;0.18;2.9;0.089;12;74;0.99738;3.14;0.54;9.4;5 +7.5;0.6;0.32;2.7;0.103;13;98;0.99938;3.45;0.62;9.5;5 +7.1;0.59;0.02;2.3;0.082;24;94;0.99744;3.55;0.53;9.7;6 +7.9;0.72;0.01;1.9;0.076;7;32;0.99668;3.39;0.54;9.6;5 +7.1;0.59;0.02;2.3;0.082;24;94;0.99744;3.55;0.53;9.7;6 +9.4;0.685;0.26;2.4;0.082;23;143;0.9978;3.28;0.55;9.4;5 +9.5;0.57;0.27;2.3;0.082;23;144;0.99782;3.27;0.55;9.4;5 +7.9;0.4;0.29;1.8;0.157;1;44;0.9973;3.3;0.92;9.5;6 +7.9;0.4;0.3;1.8;0.157;2;45;0.99727;3.31;0.91;9.5;6 +7.2;1;0;3;0.102;7;16;0.99586;3.43;0.46;10;5 +6.9;0.765;0.18;2.4;0.243;5.5;48;0.99612;3.4;0.6;10.3;6 +6.9;0.635;0.17;2.4;0.241;6;18;0.9961;3.4;0.59;10.3;6 +8.3;0.43;0.3;3.4;0.079;7;34;0.99788;3.36;0.61;10.5;5 +7.1;0.52;0.03;2.6;0.076;21;92;0.99745;3.5;0.6;9.8;5 +7;0.57;0;2;0.19;12;45;0.99676;3.31;0.6;9.4;6 +6.5;0.46;0.14;2.4;0.114;9;37;0.99732;3.66;0.65;9.8;5 +9;0.82;0.05;2.4;0.081;26;96;0.99814;3.36;0.53;10;5 +6.5;0.46;0.14;2.4;0.114;9;37;0.99732;3.66;0.65;9.8;5 +7.1;0.59;0.01;2.5;0.077;20;85;0.99746;3.55;0.59;9.8;5 +9.9;0.35;0.41;2.3;0.083;11;61;0.9982;3.21;0.5;9.5;5 +9.9;0.35;0.41;2.3;0.083;11;61;0.9982;3.21;0.5;9.5;5 +10;0.56;0.24;2.2;0.079;19;58;0.9991;3.18;0.56;10.1;6 +10;0.56;0.24;2.2;0.079;19;58;0.9991;3.18;0.56;10.1;6 +8.6;0.63;0.17;2.9;0.099;21;119;0.998;3.09;0.52;9.3;5 +7.4;0.37;0.43;2.6;0.082;18;82;0.99708;3.33;0.68;9.7;6 +8.8;0.64;0.17;2.9;0.084;25;130;0.99818;3.23;0.54;9.6;5 +7.1;0.61;0.02;2.5;0.081;17;87;0.99745;3.48;0.6;9.7;6 +7.7;0.6;0;2.6;0.055;7;13;0.99639;3.38;0.56;10.8;5 +10.1;0.27;0.54;2.3;0.065;7;26;0.99531;3.17;0.53;12.5;6 +10.8;0.89;0.3;2.6;0.132;7;60;0.99786;2.99;1.18;10.2;5 +8.7;0.46;0.31;2.5;0.126;24;64;0.99746;3.1;0.74;9.6;5 +9.3;0.37;0.44;1.6;0.038;21;42;0.99526;3.24;0.81;10.8;7 +9.4;0.5;0.34;3.6;0.082;5;14;0.9987;3.29;0.52;10.7;6 +9.4;0.5;0.34;3.6;0.082;5;14;0.9987;3.29;0.52;10.7;6 +7.2;0.61;0.08;4;0.082;26;108;0.99641;3.25;0.51;9.4;5 +8.6;0.55;0.09;3.3;0.068;8;17;0.99735;3.23;0.44;10;5 +5.1;0.585;0;1.7;0.044;14;86;0.99264;3.56;0.94;12.9;7 +7.7;0.56;0.08;2.5;0.114;14;46;0.9971;3.24;0.66;9.6;6 +8.4;0.52;0.22;2.7;0.084;4;18;0.99682;3.26;0.57;9.9;6 +8.2;0.28;0.4;2.4;0.052;4;10;0.99356;3.33;0.7;12.8;7 +8.4;0.25;0.39;2;0.041;4;10;0.99386;3.27;0.71;12.5;7 +8.2;0.28;0.4;2.4;0.052;4;10;0.99356;3.33;0.7;12.8;7 +7.4;0.53;0.12;1.9;0.165;4;12;0.99702;3.26;0.86;9.2;5 +7.6;0.48;0.31;2.8;0.07;4;15;0.99693;3.22;0.55;10.3;6 +7.3;0.49;0.1;2.6;0.068;4;14;0.99562;3.3;0.47;10.5;5 +12.9;0.5;0.55;2.8;0.072;7;24;1.00012;3.09;0.68;10.9;6 
+10.8;0.45;0.33;2.5;0.099;20;38;0.99818;3.24;0.71;10.8;5 +6.9;0.39;0.24;2.1;0.102;4;7;0.99462;3.44;0.58;11.4;4 +12.6;0.41;0.54;2.8;0.103;19;41;0.99939;3.21;0.76;11.3;6 +10.8;0.45;0.33;2.5;0.099;20;38;0.99818;3.24;0.71;10.8;5 +9.8;0.51;0.19;3.2;0.081;8;30;0.9984;3.23;0.58;10.5;6 +10.8;0.29;0.42;1.6;0.084;19;27;0.99545;3.28;0.73;11.9;6 +7.1;0.715;0;2.35;0.071;21;47;0.99632;3.29;0.45;9.4;5 +9.1;0.66;0.15;3.2;0.097;9;59;0.99976;3.28;0.54;9.6;5 +7;0.685;0;1.9;0.099;9;22;0.99606;3.34;0.6;9.7;5 +4.9;0.42;0;2.1;0.048;16;42;0.99154;3.71;0.74;14;7 +6.7;0.54;0.13;2;0.076;15;36;0.9973;3.61;0.64;9.8;5 +6.7;0.54;0.13;2;0.076;15;36;0.9973;3.61;0.64;9.8;5 +7.1;0.48;0.28;2.8;0.068;6;16;0.99682;3.24;0.53;10.3;5 +7.1;0.46;0.14;2.8;0.076;15;37;0.99624;3.36;0.49;10.7;5 +7.5;0.27;0.34;2.3;0.05;4;8;0.9951;3.4;0.64;11;7 +7.1;0.46;0.14;2.8;0.076;15;37;0.99624;3.36;0.49;10.7;5 +7.8;0.57;0.09;2.3;0.065;34;45;0.99417;3.46;0.74;12.7;8 +5.9;0.61;0.08;2.1;0.071;16;24;0.99376;3.56;0.77;11.1;6 +7.5;0.685;0.07;2.5;0.058;5;9;0.99632;3.38;0.55;10.9;4 +5.9;0.61;0.08;2.1;0.071;16;24;0.99376;3.56;0.77;11.1;6 +10.4;0.44;0.42;1.5;0.145;34;48;0.99832;3.38;0.86;9.9;3 +11.6;0.47;0.44;1.6;0.147;36;51;0.99836;3.38;0.86;9.9;4 +8.8;0.685;0.26;1.6;0.088;16;23;0.99694;3.32;0.47;9.4;5 +7.6;0.665;0.1;1.5;0.066;27;55;0.99655;3.39;0.51;9.3;5 +6.7;0.28;0.28;2.4;0.012;36;100;0.99064;3.26;0.39;11.7;7 +6.7;0.28;0.28;2.4;0.012;36;100;0.99064;3.26;0.39;11.7;7 +10.1;0.31;0.35;1.6;0.075;9;28;0.99672;3.24;0.83;11.2;7 +6;0.5;0.04;2.2;0.092;13;26;0.99647;3.46;0.47;10;5 +11.1;0.42;0.47;2.65;0.085;9;34;0.99736;3.24;0.77;12.1;7 +6.6;0.66;0;3;0.115;21;31;0.99629;3.45;0.63;10.3;5 +10.6;0.5;0.45;2.6;0.119;34;68;0.99708;3.23;0.72;10.9;6 +7.1;0.685;0.35;2;0.088;9;92;0.9963;3.28;0.62;9.4;5 +9.9;0.25;0.46;1.7;0.062;26;42;0.9959;3.18;0.83;10.6;6 +6.4;0.64;0.21;1.8;0.081;14;31;0.99689;3.59;0.66;9.8;5 +6.4;0.64;0.21;1.8;0.081;14;31;0.99689;3.59;0.66;9.8;5 +7.4;0.68;0.16;1.8;0.078;12;39;0.9977;3.5;0.7;9.9;6 +6.4;0.64;0.21;1.8;0.081;14;31;0.99689;3.59;0.66;9.8;5 +6.4;0.63;0.21;1.6;0.08;12;32;0.99689;3.58;0.66;9.8;5 +9.3;0.43;0.44;1.9;0.085;9;22;0.99708;3.28;0.55;9.5;5 +9.3;0.43;0.44;1.9;0.085;9;22;0.99708;3.28;0.55;9.5;5 +8;0.42;0.32;2.5;0.08;26;122;0.99801;3.22;1.07;9.7;5 +9.3;0.36;0.39;1.5;0.08;41;55;0.99652;3.47;0.73;10.9;6 +9.3;0.36;0.39;1.5;0.08;41;55;0.99652;3.47;0.73;10.9;6 +7.6;0.735;0.02;2.5;0.071;10;14;0.99538;3.51;0.71;11.7;7 +9.3;0.36;0.39;1.5;0.08;41;55;0.99652;3.47;0.73;10.9;6 +8.2;0.26;0.34;2.5;0.073;16;47;0.99594;3.4;0.78;11.3;7 +11.7;0.28;0.47;1.7;0.054;17;32;0.99686;3.15;0.67;10.6;7 +6.8;0.56;0.22;1.8;0.074;15;24;0.99438;3.4;0.82;11.2;6 +7.2;0.62;0.06;2.7;0.077;15;85;0.99746;3.51;0.54;9.5;5 +5.8;1.01;0.66;2;0.039;15;88;0.99357;3.66;0.6;11.5;6 +7.5;0.42;0.32;2.7;0.067;7;25;0.99628;3.24;0.44;10.4;5 +7.2;0.62;0.06;2.5;0.078;17;84;0.99746;3.51;0.53;9.7;5 +7.2;0.62;0.06;2.7;0.077;15;85;0.99746;3.51;0.54;9.5;5 +7.2;0.635;0.07;2.6;0.077;16;86;0.99748;3.51;0.54;9.7;5 +6.8;0.49;0.22;2.3;0.071;13;24;0.99438;3.41;0.83;11.3;6 +6.9;0.51;0.23;2;0.072;13;22;0.99438;3.4;0.84;11.2;6 +6.8;0.56;0.22;1.8;0.074;15;24;0.99438;3.4;0.82;11.2;6 +7.6;0.63;0.03;2;0.08;27;43;0.99578;3.44;0.64;10.9;6 +7.7;0.715;0.01;2.1;0.064;31;43;0.99371;3.41;0.57;11.8;6 +6.9;0.56;0.03;1.5;0.086;36;46;0.99522;3.53;0.57;10.6;5 +7.3;0.35;0.24;2;0.067;28;48;0.99576;3.43;0.54;10;4 +9.1;0.21;0.37;1.6;0.067;6;10;0.99552;3.23;0.58;11.1;7 +10.4;0.38;0.46;2.1;0.104;6;10;0.99664;3.12;0.65;11.8;7 +8.8;0.31;0.4;2.8;0.109;7;16;0.99614;3.31;0.79;11.8;7 +7.1;0.47;0;2.2;0.067;7;14;0.99517;3.4;0.58;10.9;4 
+7.7;0.715;0.01;2.1;0.064;31;43;0.99371;3.41;0.57;11.8;6 +8.8;0.61;0.19;4;0.094;30;69;0.99787;3.22;0.5;10;6 +7.2;0.6;0.04;2.5;0.076;18;88;0.99745;3.53;0.55;9.5;5 +9.2;0.56;0.18;1.6;0.078;10;21;0.99576;3.15;0.49;9.9;5 +7.6;0.715;0;2.1;0.068;30;35;0.99533;3.48;0.65;11.4;6 +8.4;0.31;0.29;3.1;0.194;14;26;0.99536;3.22;0.78;12;6 +7.2;0.6;0.04;2.5;0.076;18;88;0.99745;3.53;0.55;9.5;5 +8.8;0.61;0.19;4;0.094;30;69;0.99787;3.22;0.5;10;6 +8.9;0.75;0.14;2.5;0.086;9;30;0.99824;3.34;0.64;10.5;5 +9;0.8;0.12;2.4;0.083;8;28;0.99836;3.33;0.65;10.4;6 +10.7;0.52;0.38;2.6;0.066;29;56;0.99577;3.15;0.79;12.1;7 +6.8;0.57;0;2.5;0.072;32;64;0.99491;3.43;0.56;11.2;6 +10.7;0.9;0.34;6.6;0.112;23;99;1.00289;3.22;0.68;9.3;5 +7.2;0.34;0.24;2;0.071;30;52;0.99576;3.44;0.58;10.1;5 +7.2;0.66;0.03;2.3;0.078;16;86;0.99743;3.53;0.57;9.7;5 +10.1;0.45;0.23;1.9;0.082;10;18;0.99774;3.22;0.65;9.3;6 +7.2;0.66;0.03;2.3;0.078;16;86;0.99743;3.53;0.57;9.7;5 +7.2;0.63;0.03;2.2;0.08;17;88;0.99745;3.53;0.58;9.8;6 +7.1;0.59;0.01;2.3;0.08;27;43;0.9955;3.42;0.58;10.7;6 +8.3;0.31;0.39;2.4;0.078;17;43;0.99444;3.31;0.77;12.5;7 +7.1;0.59;0.01;2.3;0.08;27;43;0.9955;3.42;0.58;10.7;6 +8.3;0.31;0.39;2.4;0.078;17;43;0.99444;3.31;0.77;12.5;7 +8.3;1.02;0.02;3.4;0.084;6;11;0.99892;3.48;0.49;11;3 +8.9;0.31;0.36;2.6;0.056;10;39;0.99562;3.4;0.69;11.8;5 +7.4;0.635;0.1;2.4;0.08;16;33;0.99736;3.58;0.69;10.8;7 +7.4;0.635;0.1;2.4;0.08;16;33;0.99736;3.58;0.69;10.8;7 +6.8;0.59;0.06;6;0.06;11;18;0.9962;3.41;0.59;10.8;7 +6.8;0.59;0.06;6;0.06;11;18;0.9962;3.41;0.59;10.8;7 +9.2;0.58;0.2;3;0.081;15;115;0.998;3.23;0.59;9.5;5 +7.2;0.54;0.27;2.6;0.084;12;78;0.9964;3.39;0.71;11;5 +6.1;0.56;0;2.2;0.079;6;9;0.9948;3.59;0.54;11.5;6 +7.4;0.52;0.13;2.4;0.078;34;61;0.99528;3.43;0.59;10.8;6 +7.3;0.305;0.39;1.2;0.059;7;11;0.99331;3.29;0.52;11.5;6 +9.3;0.38;0.48;3.8;0.132;3;11;0.99577;3.23;0.57;13.2;6 +9.1;0.28;0.46;9;0.114;3;9;0.99901;3.18;0.6;10.9;6 +10;0.46;0.44;2.9;0.065;4;8;0.99674;3.33;0.62;12.2;6 +9.4;0.395;0.46;4.6;0.094;3;10;0.99639;3.27;0.64;12.2;7 +7.3;0.305;0.39;1.2;0.059;7;11;0.99331;3.29;0.52;11.5;6 +8.6;0.315;0.4;2.2;0.079;3;6;0.99512;3.27;0.67;11.9;6 +5.3;0.715;0.19;1.5;0.161;7;62;0.99395;3.62;0.61;11;5 +6.8;0.41;0.31;8.8;0.084;26;45;0.99824;3.38;0.64;10.1;6 +8.4;0.36;0.32;2.2;0.081;32;79;0.9964;3.3;0.72;11;6 +8.4;0.62;0.12;1.8;0.072;38;46;0.99504;3.38;0.89;11.8;6 +9.6;0.41;0.37;2.3;0.091;10;23;0.99786;3.24;0.56;10.5;5 +8.4;0.36;0.32;2.2;0.081;32;79;0.9964;3.3;0.72;11;6 +8.4;0.62;0.12;1.8;0.072;38;46;0.99504;3.38;0.89;11.8;6 +6.8;0.41;0.31;8.8;0.084;26;45;0.99824;3.38;0.64;10.1;6 +8.6;0.47;0.27;2.3;0.055;14;28;0.99516;3.18;0.8;11.2;5 +8.6;0.22;0.36;1.9;0.064;53;77;0.99604;3.47;0.87;11;7 +9.4;0.24;0.33;2.3;0.061;52;73;0.99786;3.47;0.9;10.2;6 +8.4;0.67;0.19;2.2;0.093;11;75;0.99736;3.2;0.59;9.2;4 +8.6;0.47;0.27;2.3;0.055;14;28;0.99516;3.18;0.8;11.2;5 +8.7;0.33;0.38;3.3;0.063;10;19;0.99468;3.3;0.73;12;7 +6.6;0.61;0.01;1.9;0.08;8;25;0.99746;3.69;0.73;10.5;5 +7.4;0.61;0.01;2;0.074;13;38;0.99748;3.48;0.65;9.8;5 +7.6;0.4;0.29;1.9;0.078;29;66;0.9971;3.45;0.59;9.5;6 +7.4;0.61;0.01;2;0.074;13;38;0.99748;3.48;0.65;9.8;5 +6.6;0.61;0.01;1.9;0.08;8;25;0.99746;3.69;0.73;10.5;5 +8.8;0.3;0.38;2.3;0.06;19;72;0.99543;3.39;0.72;11.8;6 +8.8;0.3;0.38;2.3;0.06;19;72;0.99543;3.39;0.72;11.8;6 +12;0.63;0.5;1.4;0.071;6;26;0.99791;3.07;0.6;10.4;4 +7.2;0.38;0.38;2.8;0.068;23;42;0.99356;3.34;0.72;12.9;7 +6.2;0.46;0.17;1.6;0.073;7;11;0.99425;3.61;0.54;11.4;5 +9.6;0.33;0.52;2.2;0.074;13;25;0.99509;3.36;0.76;12.4;7 +9.9;0.27;0.49;5;0.082;9;17;0.99484;3.19;0.52;12.5;7 
+10.1;0.43;0.4;2.6;0.092;13;52;0.99834;3.22;0.64;10;7 +9.8;0.5;0.34;2.3;0.094;10;45;0.99864;3.24;0.6;9.7;7 +8.3;0.3;0.49;3.8;0.09;11;24;0.99498;3.27;0.64;12.1;7 +10.2;0.44;0.42;2;0.071;7;20;0.99566;3.14;0.79;11.1;7 +10.2;0.44;0.58;4.1;0.092;11;24;0.99745;3.29;0.99;12;7 +8.3;0.28;0.48;2.1;0.093;6;12;0.99408;3.26;0.62;12.4;7 +8.9;0.12;0.45;1.8;0.075;10;21;0.99552;3.41;0.76;11.9;7 +8.9;0.12;0.45;1.8;0.075;10;21;0.99552;3.41;0.76;11.9;7 +8.9;0.12;0.45;1.8;0.075;10;21;0.99552;3.41;0.76;11.9;7 +8.3;0.28;0.48;2.1;0.093;6;12;0.99408;3.26;0.62;12.4;7 +8.2;0.31;0.4;2.2;0.058;6;10;0.99536;3.31;0.68;11.2;7 +10.2;0.34;0.48;2.1;0.052;5;9;0.99458;3.2;0.69;12.1;7 +7.6;0.43;0.4;2.7;0.082;6;11;0.99538;3.44;0.54;12.2;6 +8.5;0.21;0.52;1.9;0.09;9;23;0.99648;3.36;0.67;10.4;5 +9;0.36;0.52;2.1;0.111;5;10;0.99568;3.31;0.62;11.3;6 +9.5;0.37;0.52;2;0.088;12;51;0.99613;3.29;0.58;11.1;6 +6.4;0.57;0.12;2.3;0.12;25;36;0.99519;3.47;0.71;11.3;7 +8;0.59;0.05;2;0.089;12;32;0.99735;3.36;0.61;10;5 +8.5;0.47;0.27;1.9;0.058;18;38;0.99518;3.16;0.85;11.1;6 +7.1;0.56;0.14;1.6;0.078;7;18;0.99592;3.27;0.62;9.3;5 +6.6;0.57;0.02;2.1;0.115;6;16;0.99654;3.38;0.69;9.5;5 +8.8;0.27;0.39;2;0.1;20;27;0.99546;3.15;0.69;11.2;6 +8.5;0.47;0.27;1.9;0.058;18;38;0.99518;3.16;0.85;11.1;6 +8.3;0.34;0.4;2.4;0.065;24;48;0.99554;3.34;0.86;11;6 +9;0.38;0.41;2.4;0.103;6;10;0.99604;3.13;0.58;11.9;7 +8.5;0.66;0.2;2.1;0.097;23;113;0.99733;3.13;0.48;9.2;5 +9;0.4;0.43;2.4;0.068;29;46;0.9943;3.2;0.6;12.2;6 +6.7;0.56;0.09;2.9;0.079;7;22;0.99669;3.46;0.61;10.2;5 +10.4;0.26;0.48;1.9;0.066;6;10;0.99724;3.33;0.87;10.9;6 +10.4;0.26;0.48;1.9;0.066;6;10;0.99724;3.33;0.87;10.9;6 +10.1;0.38;0.5;2.4;0.104;6;13;0.99643;3.22;0.65;11.6;7 +8.5;0.34;0.44;1.7;0.079;6;12;0.99605;3.52;0.63;10.7;5 +8.8;0.33;0.41;5.9;0.073;7;13;0.99658;3.3;0.62;12.1;7 +7.2;0.41;0.3;2.1;0.083;35;72;0.997;3.44;0.52;9.4;5 +7.2;0.41;0.3;2.1;0.083;35;72;0.997;3.44;0.52;9.4;5 +8.4;0.59;0.29;2.6;0.109;31;119;0.99801;3.15;0.5;9.1;5 +7;0.4;0.32;3.6;0.061;9;29;0.99416;3.28;0.49;11.3;7 +12.2;0.45;0.49;1.4;0.075;3;6;0.9969;3.13;0.63;10.4;5 +9.1;0.5;0.3;1.9;0.065;8;17;0.99774;3.32;0.71;10.5;6 +9.5;0.86;0.26;1.9;0.079;13;28;0.99712;3.25;0.62;10;5 +7.3;0.52;0.32;2.1;0.07;51;70;0.99418;3.34;0.82;12.9;6 +9.1;0.5;0.3;1.9;0.065;8;17;0.99774;3.32;0.71;10.5;6 +12.2;0.45;0.49;1.4;0.075;3;6;0.9969;3.13;0.63;10.4;5 +7.4;0.58;0;2;0.064;7;11;0.99562;3.45;0.58;11.3;6 +9.8;0.34;0.39;1.4;0.066;3;7;0.9947;3.19;0.55;11.4;7 +7.1;0.36;0.3;1.6;0.08;35;70;0.99693;3.44;0.5;9.4;5 +7.7;0.39;0.12;1.7;0.097;19;27;0.99596;3.16;0.49;9.4;5 +9.7;0.295;0.4;1.5;0.073;14;21;0.99556;3.14;0.51;10.9;6 +7.7;0.39;0.12;1.7;0.097;19;27;0.99596;3.16;0.49;9.4;5 +7.1;0.34;0.28;2;0.082;31;68;0.99694;3.45;0.48;9.4;5 +6.5;0.4;0.1;2;0.076;30;47;0.99554;3.36;0.48;9.4;6 +7.1;0.34;0.28;2;0.082;31;68;0.99694;3.45;0.48;9.4;5 +10;0.35;0.45;2.5;0.092;20;88;0.99918;3.15;0.43;9.4;5 +7.7;0.6;0.06;2;0.079;19;41;0.99697;3.39;0.62;10.1;6 +5.6;0.66;0;2.2;0.087;3;11;0.99378;3.71;0.63;12.8;7 +5.6;0.66;0;2.2;0.087;3;11;0.99378;3.71;0.63;12.8;7 +8.9;0.84;0.34;1.4;0.05;4;10;0.99554;3.12;0.48;9.1;6 +6.4;0.69;0;1.65;0.055;7;12;0.99162;3.47;0.53;12.9;6 +7.5;0.43;0.3;2.2;0.062;6;12;0.99495;3.44;0.72;11.5;7 +9.9;0.35;0.38;1.5;0.058;31;47;0.99676;3.26;0.82;10.6;7 +9.1;0.29;0.33;2.05;0.063;13;27;0.99516;3.26;0.84;11.7;7 +6.8;0.36;0.32;1.8;0.067;4;8;0.9928;3.36;0.55;12.8;7 +8.2;0.43;0.29;1.6;0.081;27;45;0.99603;3.25;0.54;10.3;5 +6.8;0.36;0.32;1.8;0.067;4;8;0.9928;3.36;0.55;12.8;7 +9.1;0.29;0.33;2.05;0.063;13;27;0.99516;3.26;0.84;11.7;7 
+9.1;0.3;0.34;2;0.064;12;25;0.99516;3.26;0.84;11.7;7 +8.9;0.35;0.4;3.6;0.11;12;24;0.99549;3.23;0.7;12;7 +9.6;0.5;0.36;2.8;0.116;26;55;0.99722;3.18;0.68;10.9;5 +8.9;0.28;0.45;1.7;0.067;7;12;0.99354;3.25;0.55;12.3;7 +8.9;0.32;0.31;2;0.088;12;19;0.9957;3.17;0.55;10.4;6 +7.7;1.005;0.15;2.1;0.102;11;32;0.99604;3.23;0.48;10;5 +7.5;0.71;0;1.6;0.092;22;31;0.99635;3.38;0.58;10;6 +8;0.58;0.16;2;0.12;3;7;0.99454;3.22;0.58;11.2;6 +10.5;0.39;0.46;2.2;0.075;14;27;0.99598;3.06;0.84;11.4;6 +8.9;0.38;0.4;2.2;0.068;12;28;0.99486;3.27;0.75;12.6;7 +8;0.18;0.37;0.9;0.049;36;109;0.99007;2.89;0.44;12.7;6 +8;0.18;0.37;0.9;0.049;36;109;0.99007;2.89;0.44;12.7;6 +7;0.5;0.14;1.8;0.078;10;23;0.99636;3.53;0.61;10.4;5 +11.3;0.36;0.66;2.4;0.123;3;8;0.99642;3.2;0.53;11.9;6 +11.3;0.36;0.66;2.4;0.123;3;8;0.99642;3.2;0.53;11.9;6 +7;0.51;0.09;2.1;0.062;4;9;0.99584;3.35;0.54;10.5;5 +8.2;0.32;0.42;2.3;0.098;3;9;0.99506;3.27;0.55;12.3;6 +7.7;0.58;0.01;1.8;0.088;12;18;0.99568;3.32;0.56;10.5;7 +8.6;0.83;0;2.8;0.095;17;43;0.99822;3.33;0.6;10.4;6 +7.9;0.31;0.32;1.9;0.066;14;36;0.99364;3.41;0.56;12.6;6 +6.4;0.795;0;2.2;0.065;28;52;0.99378;3.49;0.52;11.6;5 +7.2;0.34;0.21;2.5;0.075;41;68;0.99586;3.37;0.54;10.1;6 +7.7;0.58;0.01;1.8;0.088;12;18;0.99568;3.32;0.56;10.5;7 +7.1;0.59;0;2.1;0.091;9;14;0.99488;3.42;0.55;11.5;7 +7.3;0.55;0.01;1.8;0.093;9;15;0.99514;3.35;0.58;11;7 +8.1;0.82;0;4.1;0.095;5;14;0.99854;3.36;0.53;9.6;5 +7.5;0.57;0.08;2.6;0.089;14;27;0.99592;3.3;0.59;10.4;6 +8.9;0.745;0.18;2.5;0.077;15;48;0.99739;3.2;0.47;9.7;6 +10.1;0.37;0.34;2.4;0.085;5;17;0.99683;3.17;0.65;10.6;7 +7.6;0.31;0.34;2.5;0.082;26;35;0.99356;3.22;0.59;12.5;7 +7.3;0.91;0.1;1.8;0.074;20;56;0.99672;3.35;0.56;9.2;5 +8.7;0.41;0.41;6.2;0.078;25;42;0.9953;3.24;0.77;12.6;7 +8.9;0.5;0.21;2.2;0.088;21;39;0.99692;3.33;0.83;11.1;6 +7.4;0.965;0;2.2;0.088;16;32;0.99756;3.58;0.67;10.2;5 +6.9;0.49;0.19;1.7;0.079;13;26;0.99547;3.38;0.64;9.8;6 +8.9;0.5;0.21;2.2;0.088;21;39;0.99692;3.33;0.83;11.1;6 +9.5;0.39;0.41;8.9;0.069;18;39;0.99859;3.29;0.81;10.9;7 +6.4;0.39;0.33;3.3;0.046;12;53;0.99294;3.36;0.62;12.2;6 +6.9;0.44;0;1.4;0.07;32;38;0.99438;3.32;0.58;11.4;6 +7.6;0.78;0;1.7;0.076;33;45;0.99612;3.31;0.62;10.7;6 +7.1;0.43;0.17;1.8;0.082;27;51;0.99634;3.49;0.64;10.4;5 +9.3;0.49;0.36;1.7;0.081;3;14;0.99702;3.27;0.78;10.9;6 +9.3;0.5;0.36;1.8;0.084;6;17;0.99704;3.27;0.77;10.8;6 +7.1;0.43;0.17;1.8;0.082;27;51;0.99634;3.49;0.64;10.4;5 +8.5;0.46;0.59;1.4;0.414;16;45;0.99702;3.03;1.34;9.2;5 +5.6;0.605;0.05;2.4;0.073;19;25;0.99258;3.56;0.55;12.9;5 +8.3;0.33;0.42;2.3;0.07;9;20;0.99426;3.38;0.77;12.7;7 +8.2;0.64;0.27;2;0.095;5;77;0.99747;3.13;0.62;9.1;6 +8.2;0.64;0.27;2;0.095;5;77;0.99747;3.13;0.62;9.1;6 +8.9;0.48;0.53;4;0.101;3;10;0.99586;3.21;0.59;12.1;7 +7.6;0.42;0.25;3.9;0.104;28;90;0.99784;3.15;0.57;9.1;5 +9.9;0.53;0.57;2.4;0.093;30;52;0.9971;3.19;0.76;11.6;7 +8.9;0.48;0.53;4;0.101;3;10;0.99586;3.21;0.59;12.1;7 +11.6;0.23;0.57;1.8;0.074;3;8;0.9981;3.14;0.7;9.9;6 +9.1;0.4;0.5;1.8;0.071;7;16;0.99462;3.21;0.69;12.5;8 +8;0.38;0.44;1.9;0.098;6;15;0.9956;3.3;0.64;11.4;6 +10.2;0.29;0.65;2.4;0.075;6;17;0.99565;3.22;0.63;11.8;6 +8.2;0.74;0.09;2;0.067;5;10;0.99418;3.28;0.57;11.8;6 +7.7;0.61;0.18;2.4;0.083;6;20;0.9963;3.29;0.6;10.2;6 +6.6;0.52;0.08;2.4;0.07;13;26;0.99358;3.4;0.72;12.5;7 +11.1;0.31;0.53;2.2;0.06;3;10;0.99572;3.02;0.83;10.9;7 +11.1;0.31;0.53;2.2;0.06;3;10;0.99572;3.02;0.83;10.9;7 +8;0.62;0.35;2.8;0.086;28;52;0.997;3.31;0.62;10.8;5 +9.3;0.33;0.45;1.5;0.057;19;37;0.99498;3.18;0.89;11.1;7 +7.5;0.77;0.2;8.1;0.098;30;92;0.99892;3.2;0.58;9.2;5 
+7.2;0.35;0.26;1.8;0.083;33;75;0.9968;3.4;0.58;9.5;6 +8;0.62;0.33;2.7;0.088;16;37;0.9972;3.31;0.58;10.7;6 +7.5;0.77;0.2;8.1;0.098;30;92;0.99892;3.2;0.58;9.2;5 +9.1;0.25;0.34;2;0.071;45;67;0.99769;3.44;0.86;10.2;7 +9.9;0.32;0.56;2;0.073;3;8;0.99534;3.15;0.73;11.4;6 +8.6;0.37;0.65;6.4;0.08;3;8;0.99817;3.27;0.58;11;5 +8.6;0.37;0.65;6.4;0.08;3;8;0.99817;3.27;0.58;11;5 +7.9;0.3;0.68;8.3;0.05;37.5;278;0.99316;3.01;0.51;12.3;7 +10.3;0.27;0.56;1.4;0.047;3;8;0.99471;3.16;0.51;11.8;6 +7.9;0.3;0.68;8.3;0.05;37.5;289;0.99316;3.01;0.51;12.3;7 +7.2;0.38;0.3;1.8;0.073;31;70;0.99685;3.42;0.59;9.5;6 +8.7;0.42;0.45;2.4;0.072;32;59;0.99617;3.33;0.77;12;6 +7.2;0.38;0.3;1.8;0.073;31;70;0.99685;3.42;0.59;9.5;6 +6.8;0.48;0.08;1.8;0.074;40;64;0.99529;3.12;0.49;9.6;5 +8.5;0.34;0.4;4.7;0.055;3;9;0.99738;3.38;0.66;11.6;7 +7.9;0.19;0.42;1.6;0.057;18;30;0.994;3.29;0.69;11.2;6 +11.6;0.41;0.54;1.5;0.095;22;41;0.99735;3.02;0.76;9.9;7 +11.6;0.41;0.54;1.5;0.095;22;41;0.99735;3.02;0.76;9.9;7 +10;0.26;0.54;1.9;0.083;42;74;0.99451;2.98;0.63;11.8;8 +7.9;0.34;0.42;2;0.086;8;19;0.99546;3.35;0.6;11.4;6 +7;0.54;0.09;2;0.081;10;16;0.99479;3.43;0.59;11.5;6 +9.2;0.31;0.36;2.2;0.079;11;31;0.99615;3.33;0.86;12;7 +6.6;0.725;0.09;5.5;0.117;9;17;0.99655;3.35;0.49;10.8;6 +9.4;0.4;0.47;2.5;0.087;6;20;0.99772;3.15;0.5;10.5;5 +6.6;0.725;0.09;5.5;0.117;9;17;0.99655;3.35;0.49;10.8;6 +8.6;0.52;0.38;1.5;0.096;5;18;0.99666;3.2;0.52;9.4;5 +8;0.31;0.45;2.1;0.216;5;16;0.99358;3.15;0.81;12.5;7 +8.6;0.52;0.38;1.5;0.096;5;18;0.99666;3.2;0.52;9.4;5 +8.4;0.34;0.42;2.1;0.072;23;36;0.99392;3.11;0.78;12.4;6 +7.4;0.49;0.27;2.1;0.071;14;25;0.99388;3.35;0.63;12;6 +6.1;0.48;0.09;1.7;0.078;18;30;0.99402;3.45;0.54;11.2;6 +7.4;0.49;0.27;2.1;0.071;14;25;0.99388;3.35;0.63;12;6 +8;0.48;0.34;2.2;0.073;16;25;0.9936;3.28;0.66;12.4;6 +6.3;0.57;0.28;2.1;0.048;13;49;0.99374;3.41;0.6;12.8;5 +8.2;0.23;0.42;1.9;0.069;9;17;0.99376;3.21;0.54;12.3;6 +9.1;0.3;0.41;2;0.068;10;24;0.99523;3.27;0.85;11.7;7 +8.1;0.78;0.1;3.3;0.09;4;13;0.99855;3.36;0.49;9.5;5 +10.8;0.47;0.43;2.1;0.171;27;66;0.9982;3.17;0.76;10.8;6 +8.3;0.53;0;1.4;0.07;6;14;0.99593;3.25;0.64;10;6 +5.4;0.42;0.27;2;0.092;23;55;0.99471;3.78;0.64;12.3;7 +7.9;0.33;0.41;1.5;0.056;6;35;0.99396;3.29;0.71;11;6 +8.9;0.24;0.39;1.6;0.074;3;10;0.99698;3.12;0.59;9.5;6 +5;0.4;0.5;4.3;0.046;29;80;0.9902;3.49;0.66;13.6;6 +7;0.69;0.07;2.5;0.091;15;21;0.99572;3.38;0.6;11.3;6 +7;0.69;0.07;2.5;0.091;15;21;0.99572;3.38;0.6;11.3;6 +7;0.69;0.07;2.5;0.091;15;21;0.99572;3.38;0.6;11.3;6 +7.1;0.39;0.12;2.1;0.065;14;24;0.99252;3.3;0.53;13.3;6 +5.6;0.66;0;2.5;0.066;7;15;0.99256;3.52;0.58;12.9;5 +7.9;0.54;0.34;2.5;0.076;8;17;0.99235;3.2;0.72;13.1;8 +6.6;0.5;0;1.8;0.062;21;28;0.99352;3.44;0.55;12.3;6 +6.3;0.47;0;1.4;0.055;27;33;0.9922;3.45;0.48;12.3;6 +10.7;0.4;0.37;1.9;0.081;17;29;0.99674;3.12;0.65;11.2;6 +6.5;0.58;0;2.2;0.096;3;13;0.99557;3.62;0.62;11.5;4 +8.8;0.24;0.35;1.7;0.055;13;27;0.99394;3.14;0.59;11.3;7 +5.8;0.29;0.26;1.7;0.063;3;11;0.9915;3.39;0.54;13.5;6 +6.3;0.76;0;2.9;0.072;26;52;0.99379;3.51;0.6;11.5;6 +10;0.43;0.33;2.7;0.095;28;89;0.9984;3.22;0.68;10;5 +10.5;0.43;0.35;3.3;0.092;24;70;0.99798;3.21;0.69;10.5;6 +9.1;0.6;0;1.9;0.058;5;10;0.9977;3.18;0.63;10.4;6 +5.9;0.19;0.21;1.7;0.045;57;135;0.99341;3.32;0.44;9.5;5 +7.4;0.36;0.34;1.8;0.075;18;38;0.9933;3.38;0.88;13.6;7 +7.2;0.48;0.07;5.5;0.089;10;18;0.99684;3.37;0.68;11.2;7 +8.5;0.28;0.35;1.7;0.061;6;15;0.99524;3.3;0.74;11.8;7 +8;0.25;0.43;1.7;0.067;22;50;0.9946;3.38;0.6;11.9;6 +10.4;0.52;0.45;2;0.08;6;13;0.99774;3.22;0.76;11.4;6 +10.4;0.52;0.45;2;0.08;6;13;0.99774;3.22;0.76;11.4;6 
+7.5;0.41;0.15;3.7;0.104;29;94;0.99786;3.14;0.58;9.1;5 +8.2;0.51;0.24;2;0.079;16;86;0.99764;3.34;0.64;9.5;6 +7.3;0.4;0.3;1.7;0.08;33;79;0.9969;3.41;0.65;9.5;6 +8.2;0.38;0.32;2.5;0.08;24;71;0.99624;3.27;0.85;11;6 +6.9;0.45;0.11;2.4;0.043;6;12;0.99354;3.3;0.65;11.4;6 +7;0.22;0.3;1.8;0.065;16;20;0.99672;3.61;0.82;10;6 +7.3;0.32;0.23;2.3;0.066;35;70;0.99588;3.43;0.62;10.1;5 +8.2;0.2;0.43;2.5;0.076;31;51;0.99672;3.53;0.81;10.4;6 +7.8;0.5;0.12;1.8;0.178;6;21;0.996;3.28;0.87;9.8;6 +10;0.41;0.45;6.2;0.071;6;14;0.99702;3.21;0.49;11.8;7 +7.8;0.39;0.42;2;0.086;9;21;0.99526;3.39;0.66;11.6;6 +10;0.35;0.47;2;0.061;6;11;0.99585;3.23;0.52;12;6 +8.2;0.33;0.32;2.8;0.067;4;12;0.99473;3.3;0.76;12.8;7 +6.1;0.58;0.23;2.5;0.044;16;70;0.99352;3.46;0.65;12.5;6 +8.3;0.6;0.25;2.2;0.118;9;38;0.99616;3.15;0.53;9.8;5 +9.6;0.42;0.35;2.1;0.083;17;38;0.99622;3.23;0.66;11.1;6 +6.6;0.58;0;2.2;0.1;50;63;0.99544;3.59;0.68;11.4;6 +8.3;0.6;0.25;2.2;0.118;9;38;0.99616;3.15;0.53;9.8;5 +8.5;0.18;0.51;1.75;0.071;45;88;0.99524;3.33;0.76;11.8;7 +5.1;0.51;0.18;2.1;0.042;16;101;0.9924;3.46;0.87;12.9;7 +6.7;0.41;0.43;2.8;0.076;22;54;0.99572;3.42;1.16;10.6;6 +10.2;0.41;0.43;2.2;0.11;11;37;0.99728;3.16;0.67;10.8;5 +10.6;0.36;0.57;2.3;0.087;6;20;0.99676;3.14;0.72;11.1;7 +8.8;0.45;0.43;1.4;0.076;12;21;0.99551;3.21;0.75;10.2;6 +8.5;0.32;0.42;2.3;0.075;12;19;0.99434;3.14;0.71;11.8;7 +9;0.785;0.24;1.7;0.078;10;21;0.99692;3.29;0.67;10;5 +9;0.785;0.24;1.7;0.078;10;21;0.99692;3.29;0.67;10;5 +8.5;0.44;0.5;1.9;0.369;15;38;0.99634;3.01;1.1;9.4;5 +9.9;0.54;0.26;2;0.111;7;60;0.99709;2.94;0.98;10.2;5 +8.2;0.33;0.39;2.5;0.074;29;48;0.99528;3.32;0.88;12.4;7 +6.5;0.34;0.27;2.8;0.067;8;44;0.99384;3.21;0.56;12;6 +7.6;0.5;0.29;2.3;0.086;5;14;0.99502;3.32;0.62;11.5;6 +9.2;0.36;0.34;1.6;0.062;5;12;0.99667;3.2;0.67;10.5;6 +7.1;0.59;0;2.2;0.078;26;44;0.99522;3.42;0.68;10.8;6 +9.7;0.42;0.46;2.1;0.074;5;16;0.99649;3.27;0.74;12.3;6 +7.6;0.36;0.31;1.7;0.079;26;65;0.99716;3.46;0.62;9.5;6 +7.6;0.36;0.31;1.7;0.079;26;65;0.99716;3.46;0.62;9.5;6 +6.5;0.61;0;2.2;0.095;48;59;0.99541;3.61;0.7;11.5;6 +6.5;0.88;0.03;5.6;0.079;23;47;0.99572;3.58;0.5;11.2;4 +7.1;0.66;0;2.4;0.052;6;11;0.99318;3.35;0.66;12.7;7 +5.6;0.915;0;2.1;0.041;17;78;0.99346;3.68;0.73;11.4;5 +8.2;0.35;0.33;2.4;0.076;11;47;0.99599;3.27;0.81;11;6 +8.2;0.35;0.33;2.4;0.076;11;47;0.99599;3.27;0.81;11;6 +9.8;0.39;0.43;1.65;0.068;5;11;0.99478;3.19;0.46;11.4;5 +10.2;0.4;0.4;2.5;0.068;41;54;0.99754;3.38;0.86;10.5;6 +6.8;0.66;0.07;1.6;0.07;16;61;0.99572;3.29;0.6;9.3;5 +6.7;0.64;0.23;2.1;0.08;11;119;0.99538;3.36;0.7;10.9;5 +7;0.43;0.3;2;0.085;6;39;0.99346;3.33;0.46;11.9;6 +6.6;0.8;0.03;7.8;0.079;6;12;0.9963;3.52;0.5;12.2;5 +7;0.43;0.3;2;0.085;6;39;0.99346;3.33;0.46;11.9;6 +6.7;0.64;0.23;2.1;0.08;11;119;0.99538;3.36;0.7;10.9;5 +8.8;0.955;0.05;1.8;0.075;5;19;0.99616;3.3;0.44;9.6;4 +9.1;0.4;0.57;4.6;0.08;6;20;0.99652;3.28;0.57;12.5;6 +6.5;0.885;0;2.3;0.166;6;12;0.99551;3.56;0.51;10.8;5 +7.2;0.25;0.37;2.5;0.063;11;41;0.99439;3.52;0.8;12.4;7 +6.4;0.885;0;2.3;0.166;6;12;0.99551;3.56;0.51;10.8;5 +7;0.745;0.12;1.8;0.114;15;64;0.99588;3.22;0.59;9.5;6 +6.2;0.43;0.22;1.8;0.078;21;56;0.99633;3.52;0.6;9.5;6 +7.9;0.58;0.23;2.3;0.076;23;94;0.99686;3.21;0.58;9.5;6 +7.7;0.57;0.21;1.5;0.069;4;9;0.99458;3.16;0.54;9.8;6 +7.7;0.26;0.26;2;0.052;19;77;0.9951;3.15;0.79;10.9;6 +7.9;0.58;0.23;2.3;0.076;23;94;0.99686;3.21;0.58;9.5;6 +7.7;0.57;0.21;1.5;0.069;4;9;0.99458;3.16;0.54;9.8;6 +7.9;0.34;0.36;1.9;0.065;5;10;0.99419;3.27;0.54;11.2;7 +8.6;0.42;0.39;1.8;0.068;6;12;0.99516;3.35;0.69;11.7;8 
+9.9;0.74;0.19;5.8;0.111;33;76;0.99878;3.14;0.55;9.4;5 +7.2;0.36;0.46;2.1;0.074;24;44;0.99534;3.4;0.85;11;7 +7.2;0.36;0.46;2.1;0.074;24;44;0.99534;3.4;0.85;11;7 +7.2;0.36;0.46;2.1;0.074;24;44;0.99534;3.4;0.85;11;7 +9.9;0.72;0.55;1.7;0.136;24;52;0.99752;3.35;0.94;10;5 +7.2;0.36;0.46;2.1;0.074;24;44;0.99534;3.4;0.85;11;7 +6.2;0.39;0.43;2;0.071;14;24;0.99428;3.45;0.87;11.2;7 +6.8;0.65;0.02;2.1;0.078;8;15;0.99498;3.35;0.62;10.4;6 +6.6;0.44;0.15;2.1;0.076;22;53;0.9957;3.32;0.62;9.3;5 +6.8;0.65;0.02;2.1;0.078;8;15;0.99498;3.35;0.62;10.4;6 +9.6;0.38;0.42;1.9;0.071;5;13;0.99659;3.15;0.75;10.5;6 +10.2;0.33;0.46;1.9;0.081;6;9;0.99628;3.1;0.48;10.4;6 +8.8;0.27;0.46;2.1;0.095;20;29;0.99488;3.26;0.56;11.3;6 +7.9;0.57;0.31;2;0.079;10;79;0.99677;3.29;0.69;9.5;6 +8.2;0.34;0.37;1.9;0.057;43;74;0.99408;3.23;0.81;12;6 +8.2;0.4;0.31;1.9;0.082;8;24;0.996;3.24;0.69;10.6;6 +9;0.39;0.4;1.3;0.044;25;50;0.99478;3.2;0.83;10.9;6 +10.9;0.32;0.52;1.8;0.132;17;44;0.99734;3.28;0.77;11.5;6 +10.9;0.32;0.52;1.8;0.132;17;44;0.99734;3.28;0.77;11.5;6 +8.1;0.53;0.22;2.2;0.078;33;89;0.99678;3.26;0.46;9.6;6 +10.5;0.36;0.47;2.2;0.074;9;23;0.99638;3.23;0.76;12;6 +12.6;0.39;0.49;2.5;0.08;8;20;0.9992;3.07;0.82;10.3;6 +9.2;0.46;0.23;2.6;0.091;18;77;0.99922;3.15;0.51;9.4;5 +7.5;0.58;0.03;4.1;0.08;27;46;0.99592;3.02;0.47;9.2;5 +9;0.58;0.25;2;0.104;8;21;0.99769;3.27;0.72;9.6;5 +5.1;0.42;0;1.8;0.044;18;88;0.99157;3.68;0.73;13.6;7 +7.6;0.43;0.29;2.1;0.075;19;66;0.99718;3.4;0.64;9.5;5 +7.7;0.18;0.34;2.7;0.066;15;58;0.9947;3.37;0.78;11.8;6 +7.8;0.815;0.01;2.6;0.074;48;90;0.99621;3.38;0.62;10.8;5 +7.6;0.43;0.29;2.1;0.075;19;66;0.99718;3.4;0.64;9.5;5 +10.2;0.23;0.37;2.2;0.057;14;36;0.99614;3.23;0.49;9.3;4 +7.1;0.75;0.01;2.2;0.059;11;18;0.99242;3.39;0.4;12.8;6 +6;0.33;0.32;12.9;0.054;6;113;0.99572;3.3;0.56;11.5;4 +7.8;0.55;0;1.7;0.07;7;17;0.99659;3.26;0.64;9.4;6 +7.1;0.75;0.01;2.2;0.059;11;18;0.99242;3.39;0.4;12.8;6 +8.1;0.73;0;2.5;0.081;12;24;0.99798;3.38;0.46;9.6;4 +6.5;0.67;0;4.3;0.057;11;20;0.99488;3.45;0.56;11.8;4 +7.5;0.61;0.2;1.7;0.076;36;60;0.99494;3.1;0.4;9.3;5 +9.8;0.37;0.39;2.5;0.079;28;65;0.99729;3.16;0.59;9.8;5 +9;0.4;0.41;2;0.058;15;40;0.99414;3.22;0.6;12.2;6 +8.3;0.56;0.22;2.4;0.082;10;86;0.9983;3.37;0.62;9.5;5 +5.9;0.29;0.25;13.4;0.067;72;160;0.99721;3.33;0.54;10.3;6 +7.4;0.55;0.19;1.8;0.082;15;34;0.99655;3.49;0.68;10.5;5 +7.4;0.74;0.07;1.7;0.086;15;48;0.99502;3.12;0.48;10;5 +7.4;0.55;0.19;1.8;0.082;15;34;0.99655;3.49;0.68;10.5;5 +6.9;0.41;0.33;2.2;0.081;22;36;0.9949;3.41;0.75;11.1;6 +7.1;0.6;0.01;2.3;0.079;24;37;0.99514;3.4;0.61;10.9;6 +7.1;0.6;0.01;2.3;0.079;24;37;0.99514;3.4;0.61;10.9;6 +7.5;0.58;0.14;2.2;0.077;27;60;0.9963;3.28;0.59;9.8;5 +7.1;0.72;0;1.8;0.123;6;14;0.99627;3.45;0.58;9.8;5 +7.9;0.66;0;1.4;0.096;6;13;0.99569;3.43;0.58;9.5;5 +7.8;0.7;0.06;1.9;0.079;20;35;0.99628;3.4;0.69;10.9;5 +6.1;0.64;0.02;2.4;0.069;26;46;0.99358;3.47;0.45;11;5 +7.5;0.59;0.22;1.8;0.082;43;60;0.99499;3.1;0.42;9.2;5 +7;0.58;0.28;4.8;0.085;12;69;0.99633;3.32;0.7;11;6 +6.8;0.64;0;2.7;0.123;15;33;0.99538;3.44;0.63;11.3;6 +6.8;0.64;0;2.7;0.123;15;33;0.99538;3.44;0.63;11.3;6 +8.6;0.635;0.68;1.8;0.403;19;56;0.99632;3.02;1.15;9.3;5 +6.3;1.02;0;2;0.083;17;24;0.99437;3.59;0.55;11.2;4 +9.8;0.45;0.38;2.5;0.081;34;66;0.99726;3.15;0.58;9.8;5 +8.2;0.78;0;2.2;0.089;13;26;0.9978;3.37;0.46;9.6;4 +8.5;0.37;0.32;1.8;0.066;26;51;0.99456;3.38;0.72;11.8;6 +7.2;0.57;0.05;2.3;0.081;16;36;0.99564;3.38;0.6;10.3;6 +7.2;0.57;0.05;2.3;0.081;16;36;0.99564;3.38;0.6;10.3;6 +10.4;0.43;0.5;2.3;0.068;13;19;0.996;3.1;0.87;11.4;6 
+6.9;0.41;0.31;2;0.079;21;51;0.99668;3.47;0.55;9.5;6 +5.5;0.49;0.03;1.8;0.044;28;87;0.9908;3.5;0.82;14;8 +5;0.38;0.01;1.6;0.048;26;60;0.99084;3.7;0.75;14;6 +7.3;0.44;0.2;1.6;0.049;24;64;0.9935;3.38;0.57;11.7;6 +5.9;0.46;0;1.9;0.077;25;44;0.99385;3.5;0.53;11.2;5 +7.5;0.58;0.2;2;0.073;34;44;0.99494;3.1;0.43;9.3;5 +7.8;0.58;0.13;2.1;0.102;17;36;0.9944;3.24;0.53;11.2;6 +8;0.715;0.22;2.3;0.075;13;81;0.99688;3.24;0.54;9.5;6 +8.5;0.4;0.4;6.3;0.05;3;10;0.99566;3.28;0.56;12;4 +7;0.69;0;1.9;0.114;3;10;0.99636;3.35;0.6;9.7;6 +8;0.715;0.22;2.3;0.075;13;81;0.99688;3.24;0.54;9.5;6 +9.8;0.3;0.39;1.7;0.062;3;9;0.9948;3.14;0.57;11.5;7 +7.1;0.46;0.2;1.9;0.077;28;54;0.9956;3.37;0.64;10.4;6 +7.1;0.46;0.2;1.9;0.077;28;54;0.9956;3.37;0.64;10.4;6 +7.9;0.765;0;2;0.084;9;22;0.99619;3.33;0.68;10.9;6 +8.7;0.63;0.28;2.7;0.096;17;69;0.99734;3.26;0.63;10.2;6 +7;0.42;0.19;2.3;0.071;18;36;0.99476;3.39;0.56;10.9;5 +11.3;0.37;0.5;1.8;0.09;20;47;0.99734;3.15;0.57;10.5;5 +7.1;0.16;0.44;2.5;0.068;17;31;0.99328;3.35;0.54;12.4;6 +8;0.6;0.08;2.6;0.056;3;7;0.99286;3.22;0.37;13;5 +7;0.6;0.3;4.5;0.068;20;110;0.99914;3.3;1.17;10.2;5 +7;0.6;0.3;4.5;0.068;20;110;0.99914;3.3;1.17;10.2;5 +7.6;0.74;0;1.9;0.1;6;12;0.99521;3.36;0.59;11;5 +8.2;0.635;0.1;2.1;0.073;25;60;0.99638;3.29;0.75;10.9;6 +5.9;0.395;0.13;2.4;0.056;14;28;0.99362;3.62;0.67;12.4;6 +7.5;0.755;0;1.9;0.084;6;12;0.99672;3.34;0.49;9.7;4 +8.2;0.635;0.1;2.1;0.073;25;60;0.99638;3.29;0.75;10.9;6 +6.6;0.63;0;4.3;0.093;51;77.5;0.99558;3.2;0.45;9.5;5 +6.6;0.63;0;4.3;0.093;51;77.5;0.99558;3.2;0.45;9.5;5 +7.2;0.53;0.14;2.1;0.064;15;29;0.99323;3.35;0.61;12.1;6 +5.7;0.6;0;1.4;0.063;11;18;0.99191;3.45;0.56;12.2;6 +7.6;1.58;0;2.1;0.137;5;9;0.99476;3.5;0.4;10.9;3 +5.2;0.645;0;2.15;0.08;15;28;0.99444;3.78;0.61;12.5;6 +6.7;0.86;0.07;2;0.1;20;57;0.99598;3.6;0.74;11.7;6 +9.1;0.37;0.32;2.1;0.064;4;15;0.99576;3.3;0.8;11.2;6 +8;0.28;0.44;1.8;0.081;28;68;0.99501;3.36;0.66;11.2;5 +7.6;0.79;0.21;2.3;0.087;21;68;0.9955;3.12;0.44;9.2;5 +7.5;0.61;0.26;1.9;0.073;24;88;0.99612;3.3;0.53;9.8;5 +9.7;0.69;0.32;2.5;0.088;22;91;0.9979;3.29;0.62;10.1;5 +6.8;0.68;0.09;3.9;0.068;15;29;0.99524;3.41;0.52;11.1;4 +9.7;0.69;0.32;2.5;0.088;22;91;0.9979;3.29;0.62;10.1;5 +7;0.62;0.1;1.4;0.071;27;63;0.996;3.28;0.61;9.2;5 +7.5;0.61;0.26;1.9;0.073;24;88;0.99612;3.3;0.53;9.8;5 +6.5;0.51;0.15;3;0.064;12;27;0.9929;3.33;0.59;12.8;6 +8;1.18;0.21;1.9;0.083;14;41;0.99532;3.34;0.47;10.5;5 +7;0.36;0.21;2.3;0.086;20;65;0.99558;3.4;0.54;10.1;6 +7;0.36;0.21;2.4;0.086;24;69;0.99556;3.4;0.53;10.1;6 +7.5;0.63;0.27;2;0.083;17;91;0.99616;3.26;0.58;9.8;6 +5.4;0.74;0;1.2;0.041;16;46;0.99258;4.01;0.59;12.5;6 +9.9;0.44;0.46;2.2;0.091;10;41;0.99638;3.18;0.69;11.9;6 +7.5;0.63;0.27;2;0.083;17;91;0.99616;3.26;0.58;9.8;6 +9.1;0.76;0.68;1.7;0.414;18;64;0.99652;2.9;1.33;9.1;6 +9.7;0.66;0.34;2.6;0.094;12;88;0.99796;3.26;0.66;10.1;5 +5;0.74;0;1.2;0.041;16;46;0.99258;4.01;0.59;12.5;6 +9.1;0.34;0.42;1.8;0.058;9;18;0.99392;3.18;0.55;11.4;5 +9.1;0.36;0.39;1.8;0.06;21;55;0.99495;3.18;0.82;11;7 +6.7;0.46;0.24;1.7;0.077;18;34;0.9948;3.39;0.6;10.6;6 +6.7;0.46;0.24;1.7;0.077;18;34;0.9948;3.39;0.6;10.6;6 +6.7;0.46;0.24;1.7;0.077;18;34;0.9948;3.39;0.6;10.6;6 +6.7;0.46;0.24;1.7;0.077;18;34;0.9948;3.39;0.6;10.6;6 +6.5;0.52;0.11;1.8;0.073;13;38;0.9955;3.34;0.52;9.3;5 +7.4;0.6;0.26;2.1;0.083;17;91;0.99616;3.29;0.56;9.8;6 +7.4;0.6;0.26;2.1;0.083;17;91;0.99616;3.29;0.56;9.8;6 +7.8;0.87;0.26;3.8;0.107;31;67;0.99668;3.26;0.46;9.2;5 +8.4;0.39;0.1;1.7;0.075;6;25;0.99581;3.09;0.43;9.7;6 +9.1;0.775;0.22;2.2;0.079;12;48;0.9976;3.18;0.51;9.6;5 
+7.2;0.835;0;2;0.166;4;11;0.99608;3.39;0.52;10;5 +6.6;0.58;0.02;2.4;0.069;19;40;0.99387;3.38;0.66;12.6;6 +6;0.5;0;1.4;0.057;15;26;0.99448;3.36;0.45;9.5;5 +6;0.5;0;1.4;0.057;15;26;0.99448;3.36;0.45;9.5;5 +6;0.5;0;1.4;0.057;15;26;0.99448;3.36;0.45;9.5;5 +7.5;0.51;0.02;1.7;0.084;13;31;0.99538;3.36;0.54;10.5;6 +7.5;0.51;0.02;1.7;0.084;13;31;0.99538;3.36;0.54;10.5;6 +7.5;0.51;0.02;1.7;0.084;13;31;0.99538;3.36;0.54;10.5;6 +7.6;0.54;0.02;1.7;0.085;17;31;0.99589;3.37;0.51;10.4;6 +7.5;0.51;0.02;1.7;0.084;13;31;0.99538;3.36;0.54;10.5;6 +11.5;0.42;0.48;2.6;0.077;8;20;0.99852;3.09;0.53;11;5 +8.2;0.44;0.24;2.3;0.063;10;28;0.99613;3.25;0.53;10.2;6 +6.1;0.59;0.01;2.1;0.056;5;13;0.99472;3.52;0.56;11.4;5 +7.2;0.655;0.03;1.8;0.078;7;12;0.99587;3.34;0.39;9.5;5 +7.2;0.655;0.03;1.8;0.078;7;12;0.99587;3.34;0.39;9.5;5 +6.9;0.57;0;2.8;0.081;21;41;0.99518;3.41;0.52;10.8;5 +9;0.6;0.29;2;0.069;32;73;0.99654;3.34;0.57;10;5 +7.2;0.62;0.01;2.3;0.065;8;46;0.99332;3.32;0.51;11.8;6 +7.6;0.645;0.03;1.9;0.086;14;57;0.9969;3.37;0.46;10.3;5 +7.6;0.645;0.03;1.9;0.086;14;57;0.9969;3.37;0.46;10.3;5 +7.2;0.58;0.03;2.3;0.077;7;28;0.99568;3.35;0.52;10;5 +6.1;0.32;0.25;1.8;0.086;5;32;0.99464;3.36;0.44;10.1;5 +6.1;0.34;0.25;1.8;0.084;4;28;0.99464;3.36;0.44;10.1;5 +7.3;0.43;0.24;2.5;0.078;27;67;0.99648;3.6;0.59;11.1;6 +7.4;0.64;0.17;5.4;0.168;52;98;0.99736;3.28;0.5;9.5;5 +11.6;0.475;0.4;1.4;0.091;6;28;0.99704;3.07;0.65;10.0333333333333;6 +9.2;0.54;0.31;2.3;0.112;11;38;0.99699;3.24;0.56;10.9;5 +8.3;0.85;0.14;2.5;0.093;13;54;0.99724;3.36;0.54;10.1;5 +11.6;0.475;0.4;1.4;0.091;6;28;0.99704;3.07;0.65;10.0333333333333;6 +8;0.83;0.27;2;0.08;11;63;0.99652;3.29;0.48;9.8;4 +7.2;0.605;0.02;1.9;0.096;10;31;0.995;3.46;0.53;11.8;6 +7.8;0.5;0.09;2.2;0.115;10;42;0.9971;3.18;0.62;9.5;5 +7.3;0.74;0.08;1.7;0.094;10;45;0.99576;3.24;0.5;9.8;5 +6.9;0.54;0.3;2.2;0.088;9;105;0.99725;3.25;1.18;10.5;6 +8;0.77;0.32;2.1;0.079;16;74;0.99656;3.27;0.5;9.8;6 +6.6;0.61;0;1.6;0.069;4;8;0.99396;3.33;0.37;10.4;4 +8.7;0.78;0.51;1.7;0.415;12;66;0.99623;3;1.17;9.2;5 +7.5;0.58;0.56;3.1;0.153;5;14;0.99476;3.21;1.03;11.6;6 +8.7;0.78;0.51;1.7;0.415;12;66;0.99623;3;1.17;9.2;5 +7.7;0.75;0.27;3.8;0.11;34;89;0.99664;3.24;0.45;9.3;5 +6.8;0.815;0;1.2;0.267;16;29;0.99471;3.32;0.51;9.8;3 +7.2;0.56;0.26;2;0.083;13;100;0.99586;3.26;0.52;9.9;5 +8.2;0.885;0.2;1.4;0.086;7;31;0.9946;3.11;0.46;10;5 +5.2;0.49;0.26;2.3;0.09;23;74;0.9953;3.71;0.62;12.2;6 +7.2;0.45;0.15;2;0.078;10;28;0.99609;3.29;0.51;9.9;6 +7.5;0.57;0.02;2.6;0.077;11;35;0.99557;3.36;0.62;10.8;6 +7.5;0.57;0.02;2.6;0.077;11;35;0.99557;3.36;0.62;10.8;6 +6.8;0.83;0.09;1.8;0.074;4;25;0.99534;3.38;0.45;9.6;5 +8;0.6;0.22;2.1;0.08;25;105;0.99613;3.3;0.49;9.9;5 +8;0.6;0.22;2.1;0.08;25;105;0.99613;3.3;0.49;9.9;5 +7.1;0.755;0.15;1.8;0.107;20;84;0.99593;3.19;0.5;9.5;5 +8;0.81;0.25;3.4;0.076;34;85;0.99668;3.19;0.42;9.2;5 +7.4;0.64;0.07;1.8;0.1;8;23;0.9961;3.3;0.58;9.6;5 +7.4;0.64;0.07;1.8;0.1;8;23;0.9961;3.3;0.58;9.6;5 +6.6;0.64;0.31;6.1;0.083;7;49;0.99718;3.35;0.68;10.3;5 +6.7;0.48;0.02;2.2;0.08;36;111;0.99524;3.1;0.53;9.7;5 +6;0.49;0;2.3;0.068;15;33;0.99292;3.58;0.59;12.5;6 +8;0.64;0.22;2.4;0.094;5;33;0.99612;3.37;0.58;11;5 +7.1;0.62;0.06;1.3;0.07;5;12;0.9942;3.17;0.48;9.8;5 +8;0.52;0.25;2;0.078;19;59;0.99612;3.3;0.48;10.2;5 +6.4;0.57;0.14;3.9;0.07;27;73;0.99669;3.32;0.48;9.2;5 +8.6;0.685;0.1;1.6;0.092;3;12;0.99745;3.31;0.65;9.55;6 +8.7;0.675;0.1;1.6;0.09;4;11;0.99745;3.31;0.65;9.55;5 +7.3;0.59;0.26;2;0.08;17;104;0.99584;3.28;0.52;9.9;5 +7;0.6;0.12;2.2;0.083;13;28;0.9966;3.52;0.62;10.2;7 +7.2;0.67;0;2.2;0.068;10;24;0.9956;3.42;0.72;11.1;6 
+7.9;0.69;0.21;2.1;0.08;33;141;0.9962;3.25;0.51;9.9;5 +7.9;0.69;0.21;2.1;0.08;33;141;0.9962;3.25;0.51;9.9;5 +7.6;0.3;0.42;2;0.052;6;24;0.9963;3.44;0.82;11.9;6 +7.2;0.33;0.33;1.7;0.061;3;13;0.996;3.23;1.1;10;8 +8;0.5;0.39;2.6;0.082;12;46;0.9985;3.43;0.62;10.7;6 +7.7;0.28;0.3;2;0.062;18;34;0.9952;3.28;0.9;11.3;7 +8.2;0.24;0.34;5.1;0.062;8;22;0.9974;3.22;0.94;10.9;6 +6;0.51;0;2.1;0.064;40;54;0.995;3.54;0.93;10.7;6 +8.1;0.29;0.36;2.2;0.048;35;53;0.995;3.27;1.01;12.4;7 +6;0.51;0;2.1;0.064;40;54;0.995;3.54;0.93;10.7;6 +6.6;0.96;0;1.8;0.082;5;16;0.9936;3.5;0.44;11.9;6 +6.4;0.47;0.4;2.4;0.071;8;19;0.9963;3.56;0.73;10.6;6 +8.2;0.24;0.34;5.1;0.062;8;22;0.9974;3.22;0.94;10.9;6 +9.9;0.57;0.25;2;0.104;12;89;0.9963;3.04;0.9;10.1;5 +10;0.32;0.59;2.2;0.077;3;15;0.9994;3.2;0.78;9.6;5 +6.2;0.58;0;1.6;0.065;8;18;0.9966;3.56;0.84;9.4;5 +10;0.32;0.59;2.2;0.077;3;15;0.9994;3.2;0.78;9.6;5 +7.3;0.34;0.33;2.5;0.064;21;37;0.9952;3.35;0.77;12.1;7 +7.8;0.53;0.01;1.6;0.077;3;19;0.995;3.16;0.46;9.8;5 +7.7;0.64;0.21;2.2;0.077;32;133;0.9956;3.27;0.45;9.9;5 +7.8;0.53;0.01;1.6;0.077;3;19;0.995;3.16;0.46;9.8;5 +7.5;0.4;0.18;1.6;0.079;24;58;0.9965;3.34;0.58;9.4;5 +7;0.54;0;2.1;0.079;39;55;0.9956;3.39;0.84;11.4;6 +6.4;0.53;0.09;3.9;0.123;14;31;0.9968;3.5;0.67;11;4 +8.3;0.26;0.37;1.4;0.076;8;23;0.9974;3.26;0.7;9.6;6 +8.3;0.26;0.37;1.4;0.076;8;23;0.9974;3.26;0.7;9.6;6 +7.7;0.23;0.37;1.8;0.046;23;60;0.9971;3.41;0.71;12.1;6 +7.6;0.41;0.33;2.5;0.078;6;23;0.9957;3.3;0.58;11.2;5 +7.8;0.64;0;1.9;0.072;27;55;0.9962;3.31;0.63;11;5 +7.9;0.18;0.4;2.2;0.049;38;67;0.996;3.33;0.93;11.3;5 +7.4;0.41;0.24;1.8;0.066;18;47;0.9956;3.37;0.62;10.4;5 +7.6;0.43;0.31;2.1;0.069;13;74;0.9958;3.26;0.54;9.9;6 +5.9;0.44;0;1.6;0.042;3;11;0.9944;3.48;0.85;11.7;6 +6.1;0.4;0.16;1.8;0.069;11;25;0.9955;3.42;0.74;10.1;7 +10.2;0.54;0.37;15.4;0.214;55;95;1.00369;3.18;0.77;9;6 +10.2;0.54;0.37;15.4;0.214;55;95;1.00369;3.18;0.77;9;6 +10;0.38;0.38;1.6;0.169;27;90;0.99914;3.15;0.65;8.5;5 +6.8;0.915;0.29;4.8;0.07;15;39;0.99577;3.53;0.54;11.1;5 +7;0.59;0;1.7;0.052;3;8;0.996;3.41;0.47;10.3;5 +7.3;0.67;0.02;2.2;0.072;31;92;0.99566;3.32;0.68;11.0666666666667;6 +7.2;0.37;0.32;2;0.062;15;28;0.9947;3.23;0.73;11.3;7 +7.4;0.785;0.19;5.2;0.094;19;98;0.99713;3.16;0.52;9.56666666666667;6 +6.9;0.63;0.02;1.9;0.078;18;30;0.99712;3.4;0.75;9.8;5 +6.9;0.58;0.2;1.75;0.058;8;22;0.99322;3.38;0.49;11.7;5 +7.3;0.67;0.02;2.2;0.072;31;92;0.99566;3.32;0.68;11.1;6 +7.4;0.785;0.19;5.2;0.094;19;98;0.99713;3.16;0.52;9.6;6 +6.9;0.63;0.02;1.9;0.078;18;30;0.99712;3.4;0.75;9.8;5 +6.8;0.67;0;1.9;0.08;22;39;0.99701;3.4;0.74;9.7;5 +6.9;0.58;0.01;1.9;0.08;40;54;0.99683;3.4;0.73;9.7;5 +7.2;0.38;0.31;2;0.056;15;29;0.99472;3.23;0.76;11.3;8 +7.2;0.37;0.32;2;0.062;15;28;0.9947;3.23;0.73;11.3;7 +7.8;0.32;0.44;2.7;0.104;8;17;0.99732;3.33;0.78;11;7 +6.6;0.58;0.02;2;0.062;37;53;0.99374;3.35;0.76;11.6;7 +7.6;0.49;0.33;1.9;0.074;27;85;0.99706;3.41;0.58;9;5 +11.7;0.45;0.63;2.2;0.073;7;23;0.99974;3.21;0.69;10.9;6 +6.5;0.9;0;1.6;0.052;9;17;0.99467;3.5;0.63;10.9;6 +6;0.54;0.06;1.8;0.05;38;89;0.99236;3.3;0.5;10.55;6 +7.6;0.49;0.33;1.9;0.074;27;85;0.99706;3.41;0.58;9;5 +8.4;0.29;0.4;1.7;0.067;8;20;0.99603;3.39;0.6;10.5;5 +7.9;0.2;0.35;1.7;0.054;7;15;0.99458;3.32;0.8;11.9;7 +6.4;0.42;0.09;2.3;0.054;34;64;0.99724;3.41;0.68;10.4;6 +6.2;0.785;0;2.1;0.06;6;13;0.99664;3.59;0.61;10;4 +6.8;0.64;0.03;2.3;0.075;14;31;0.99545;3.36;0.58;10.4;6 +6.9;0.63;0.01;2.4;0.076;14;39;0.99522;3.34;0.53;10.8;6 +6.8;0.59;0.1;1.7;0.063;34;53;0.9958;3.41;0.67;9.7;5 +6.8;0.59;0.1;1.7;0.063;34;53;0.9958;3.41;0.67;9.7;5 
+7.3;0.48;0.32;2.1;0.062;31;54;0.99728;3.3;0.65;10;7 +6.7;1.04;0.08;2.3;0.067;19;32;0.99648;3.52;0.57;11;4 +7.3;0.48;0.32;2.1;0.062;31;54;0.99728;3.3;0.65;10;7 +7.3;0.98;0.05;2.1;0.061;20;49;0.99705;3.31;0.55;9.7;3 +10;0.69;0.11;1.4;0.084;8;24;0.99578;2.88;0.47;9.7;5 +6.7;0.7;0.08;3.75;0.067;8;16;0.99334;3.43;0.52;12.6;5 +7.6;0.35;0.6;2.6;0.073;23;44;0.99656;3.38;0.79;11.1;6 +6.1;0.6;0.08;1.8;0.071;14;45;0.99336;3.38;0.54;11;5 +9.9;0.5;0.5;13.8;0.205;48;82;1.00242;3.16;0.75;8.8;5 +5.3;0.47;0.11;2.2;0.048;16;89;0.99182;3.54;0.88;13.5666666666667;7 +9.9;0.5;0.5;13.8;0.205;48;82;1.00242;3.16;0.75;8.8;5 +5.3;0.47;0.11;2.2;0.048;16;89;0.99182;3.54;0.88;13.6;7 +7.1;0.875;0.05;5.7;0.082;3;14;0.99808;3.4;0.52;10.2;3 +8.2;0.28;0.6;3;0.104;10;22;0.99828;3.39;0.68;10.6;5 +5.6;0.62;0.03;1.5;0.08;6;13;0.99498;3.66;0.62;10.1;4 +8.2;0.28;0.6;3;0.104;10;22;0.99828;3.39;0.68;10.6;5 +7.2;0.58;0.54;2.1;0.114;3;9;0.99719;3.33;0.57;10.3;4 +8.1;0.33;0.44;1.5;0.042;6;12;0.99542;3.35;0.61;10.7;5 +6.8;0.91;0.06;2;0.06;4;11;0.99592;3.53;0.64;10.9;4 +7;0.655;0.16;2.1;0.074;8;25;0.99606;3.37;0.55;9.7;5 +6.8;0.68;0.21;2.1;0.07;9;23;0.99546;3.38;0.6;10.3;5 +6;0.64;0.05;1.9;0.066;9;17;0.99496;3.52;0.78;10.6;5 +5.6;0.54;0.04;1.7;0.049;5;13;0.9942;3.72;0.58;11.4;5 +6.2;0.57;0.1;2.1;0.048;4;11;0.99448;3.44;0.76;10.8;6 +7.1;0.22;0.49;1.8;0.039;8;18;0.99344;3.39;0.56;12.4;6 +5.6;0.54;0.04;1.7;0.049;5;13;0.9942;3.72;0.58;11.4;5 +6.2;0.65;0.06;1.6;0.05;6;18;0.99348;3.57;0.54;11.95;5 +7.7;0.54;0.26;1.9;0.089;23;147;0.99636;3.26;0.59;9.7;5 +6.4;0.31;0.09;1.4;0.066;15;28;0.99459;3.42;0.7;10;7 +7;0.43;0.02;1.9;0.08;15;28;0.99492;3.35;0.81;10.6;6 +7.7;0.54;0.26;1.9;0.089;23;147;0.99636;3.26;0.59;9.7;5 +6.9;0.74;0.03;2.3;0.054;7;16;0.99508;3.45;0.63;11.5;6 +6.6;0.895;0.04;2.3;0.068;7;13;0.99582;3.53;0.58;10.8;6 +6.9;0.74;0.03;2.3;0.054;7;16;0.99508;3.45;0.63;11.5;6 +7.5;0.725;0.04;1.5;0.076;8;15;0.99508;3.26;0.53;9.6;5 +7.8;0.82;0.29;4.3;0.083;21;64;0.99642;3.16;0.53;9.4;5 +7.3;0.585;0.18;2.4;0.078;15;60;0.99638;3.31;0.54;9.8;5 +6.2;0.44;0.39;2.5;0.077;6;14;0.99555;3.51;0.69;11;6 +7.5;0.38;0.57;2.3;0.106;5;12;0.99605;3.36;0.55;11.4;6 +6.7;0.76;0.02;1.8;0.078;6;12;0.996;3.55;0.63;9.95;3 +6.8;0.81;0.05;2;0.07;6;14;0.99562;3.51;0.66;10.8;6 +7.5;0.38;0.57;2.3;0.106;5;12;0.99605;3.36;0.55;11.4;6 +7.1;0.27;0.6;2.1;0.074;17;25;0.99814;3.38;0.72;10.6;6 +7.9;0.18;0.4;1.8;0.062;7;20;0.9941;3.28;0.7;11.1;5 +6.4;0.36;0.21;2.2;0.047;26;48;0.99661;3.47;0.77;9.7;6 +7.1;0.69;0.04;2.1;0.068;19;27;0.99712;3.44;0.67;9.8;5 +6.4;0.79;0.04;2.2;0.061;11;17;0.99588;3.53;0.65;10.4;6 +6.4;0.56;0.15;1.8;0.078;17;65;0.99294;3.33;0.6;10.5;6 +6.9;0.84;0.21;4.1;0.074;16;65;0.99842;3.53;0.72;9.23333333333333;6 +6.9;0.84;0.21;4.1;0.074;16;65;0.99842;3.53;0.72;9.25;6 +6.1;0.32;0.25;2.3;0.071;23;58;0.99633;3.42;0.97;10.6;5 +6.5;0.53;0.06;2;0.063;29;44;0.99489;3.38;0.83;10.3;6 +7.4;0.47;0.46;2.2;0.114;7;20;0.99647;3.32;0.63;10.5;5 +6.6;0.7;0.08;2.6;0.106;14;27;0.99665;3.44;0.58;10.2;5 +6.5;0.53;0.06;2;0.063;29;44;0.99489;3.38;0.83;10.3;6 +6.9;0.48;0.2;1.9;0.082;9;23;0.99585;3.39;0.43;9.05;4 +6.1;0.32;0.25;2.3;0.071;23;58;0.99633;3.42;0.97;10.6;5 +6.8;0.48;0.25;2;0.076;29;61;0.9953;3.34;0.6;10.4;5 +6;0.42;0.19;2;0.075;22;47;0.99522;3.39;0.78;10;6 +6.7;0.48;0.08;2.1;0.064;18;34;0.99552;3.33;0.64;9.7;5 +6.8;0.47;0.08;2.2;0.064;18;38;0.99553;3.3;0.65;9.6;6 +7.1;0.53;0.07;1.7;0.071;15;24;0.9951;3.29;0.66;10.8;6 +7.9;0.29;0.49;2.2;0.096;21;59;0.99714;3.31;0.67;10.1;6 +7.1;0.69;0.08;2.1;0.063;42;52;0.99608;3.42;0.6;10.2;6 
+6.6;0.44;0.09;2.2;0.063;9;18;0.99444;3.42;0.69;11.3;6 +6.1;0.705;0.1;2.8;0.081;13;28;0.99631;3.6;0.66;10.2;5 +7.2;0.53;0.13;2;0.058;18;22;0.99573;3.21;0.68;9.9;6 +8;0.39;0.3;1.9;0.074;32;84;0.99717;3.39;0.61;9;5 +6.6;0.56;0.14;2.4;0.064;13;29;0.99397;3.42;0.62;11.7;7 +7;0.55;0.13;2.2;0.075;15;35;0.9959;3.36;0.59;9.7;6 +6.1;0.53;0.08;1.9;0.077;24;45;0.99528;3.6;0.68;10.3;6 +5.4;0.58;0.08;1.9;0.059;20;31;0.99484;3.5;0.64;10.2;6 +6.2;0.64;0.09;2.5;0.081;15;26;0.99538;3.57;0.63;12;5 +7.2;0.39;0.32;1.8;0.065;34;60;0.99714;3.46;0.78;9.9;5 +6.2;0.52;0.08;4.4;0.071;11;32;0.99646;3.56;0.63;11.6;6 +7.4;0.25;0.29;2.2;0.054;19;49;0.99666;3.4;0.76;10.9;7 +6.7;0.855;0.02;1.9;0.064;29;38;0.99472;3.3;0.56;10.75;6 +11.1;0.44;0.42;2.2;0.064;14;19;0.99758;3.25;0.57;10.4;6 +8.4;0.37;0.43;2.3;0.063;12;19;0.9955;3.17;0.81;11.2;7 +6.5;0.63;0.33;1.8;0.059;16;28;0.99531;3.36;0.64;10.1;6 +7;0.57;0.02;2;0.072;17;26;0.99575;3.36;0.61;10.2;5 +6.3;0.6;0.1;1.6;0.048;12;26;0.99306;3.55;0.51;12.1;5 +11.2;0.4;0.5;2;0.099;19;50;0.99783;3.1;0.58;10.4;5 +7.4;0.36;0.3;1.8;0.074;17;24;0.99419;3.24;0.7;11.4;8 +7.1;0.68;0;2.3;0.087;17;26;0.99783;3.45;0.53;9.5;5 +7.1;0.67;0;2.3;0.083;18;27;0.99768;3.44;0.54;9.4;5 +6.3;0.68;0.01;3.7;0.103;32;54;0.99586;3.51;0.66;11.3;6 +7.3;0.735;0;2.2;0.08;18;28;0.99765;3.41;0.6;9.4;5 +6.6;0.855;0.02;2.4;0.062;15;23;0.99627;3.54;0.6;11;6 +7;0.56;0.17;1.7;0.065;15;24;0.99514;3.44;0.68;10.55;7 +6.6;0.88;0.04;2.2;0.066;12;20;0.99636;3.53;0.56;9.9;5 +6.6;0.855;0.02;2.4;0.062;15;23;0.99627;3.54;0.6;11;6 +6.9;0.63;0.33;6.7;0.235;66;115;0.99787;3.22;0.56;9.5;5 +7.8;0.6;0.26;2;0.08;31;131;0.99622;3.21;0.52;9.9;5 +7.8;0.6;0.26;2;0.08;31;131;0.99622;3.21;0.52;9.9;5 +7.8;0.6;0.26;2;0.08;31;131;0.99622;3.21;0.52;9.9;5 +7.2;0.695;0.13;2;0.076;12;20;0.99546;3.29;0.54;10.1;5 +7.2;0.695;0.13;2;0.076;12;20;0.99546;3.29;0.54;10.1;5 +7.2;0.695;0.13;2;0.076;12;20;0.99546;3.29;0.54;10.1;5 +6.7;0.67;0.02;1.9;0.061;26;42;0.99489;3.39;0.82;10.9;6 +6.7;0.16;0.64;2.1;0.059;24;52;0.99494;3.34;0.71;11.2;6 +7.2;0.695;0.13;2;0.076;12;20;0.99546;3.29;0.54;10.1;5 +7;0.56;0.13;1.6;0.077;25;42;0.99629;3.34;0.59;9.2;5 +6.2;0.51;0.14;1.9;0.056;15;34;0.99396;3.48;0.57;11.5;6 +6.4;0.36;0.53;2.2;0.23;19;35;0.9934;3.37;0.93;12.4;6 +6.4;0.38;0.14;2.2;0.038;15;25;0.99514;3.44;0.65;11.1;6 +7.3;0.69;0.32;2.2;0.069;35;104;0.99632;3.33;0.51;9.5;5 +6;0.58;0.2;2.4;0.075;15;50;0.99467;3.58;0.67;12.5;6 +5.6;0.31;0.78;13.9;0.074;23;92;0.99677;3.39;0.48;10.5;6 +7.5;0.52;0.4;2.2;0.06;12;20;0.99474;3.26;0.64;11.8;6 +8;0.3;0.63;1.6;0.081;16;29;0.99588;3.3;0.78;10.8;6 +6.2;0.7;0.15;5.1;0.076;13;27;0.99622;3.54;0.6;11.9;6 +6.8;0.67;0.15;1.8;0.118;13;20;0.9954;3.42;0.67;11.3;6 +6.2;0.56;0.09;1.7;0.053;24;32;0.99402;3.54;0.6;11.3;5 +7.4;0.35;0.33;2.4;0.068;9;26;0.9947;3.36;0.6;11.9;6 +6.2;0.56;0.09;1.7;0.053;24;32;0.99402;3.54;0.6;11.3;5 +6.1;0.715;0.1;2.6;0.053;13;27;0.99362;3.57;0.5;11.9;5 +6.2;0.46;0.29;2.1;0.074;32;98;0.99578;3.33;0.62;9.8;5 +6.7;0.32;0.44;2.4;0.061;24;34;0.99484;3.29;0.8;11.6;7 +7.2;0.39;0.44;2.6;0.066;22;48;0.99494;3.3;0.84;11.5;6 +7.5;0.31;0.41;2.4;0.065;34;60;0.99492;3.34;0.85;11.4;6 +5.8;0.61;0.11;1.8;0.066;18;28;0.99483;3.55;0.66;10.9;6 +7.2;0.66;0.33;2.5;0.068;34;102;0.99414;3.27;0.78;12.8;6 +6.6;0.725;0.2;7.8;0.073;29;79;0.9977;3.29;0.54;9.2;5 +6.3;0.55;0.15;1.8;0.077;26;35;0.99314;3.32;0.82;11.6;6 +5.4;0.74;0.09;1.7;0.089;16;26;0.99402;3.67;0.56;11.6;6 +6.3;0.51;0.13;2.3;0.076;29;40;0.99574;3.42;0.75;11;6 +6.8;0.62;0.08;1.9;0.068;28;38;0.99651;3.42;0.82;9.5;6 +6.2;0.6;0.08;2;0.09;32;44;0.9949;3.45;0.58;10.5;5 
+5.9;0.55;0.1;2.2;0.062;39;51;0.99512;3.52;0.76;11.2;6
+6.3;0.51;0.13;2.3;0.076;29;40;0.99574;3.42;0.75;11;6
+5.9;0.645;0.12;2;0.075;32;44;0.99547;3.57;0.71;10.2;5
+6;0.31;0.47;3.6;0.067;18;42;0.99549;3.39;0.66;11;6
diff --git a/one_md_per_day_format/piscine/Week1/data/D01/ex8/winequality.names b/one_md_per_day_format/piscine/Week1/data/D01/ex8/winequality.names
new file mode 100644
index 0000000..4e1de1f
--- /dev/null
+++ b/one_md_per_day_format/piscine/Week1/data/D01/ex8/winequality.names
@@ -0,0 +1,72 @@
+Citation Request:
+  This dataset is publicly available for research. The details are described in [Cortez et al., 2009].
+  Please include this citation if you plan to use this database:
+
+  P. Cortez, A. Cerdeira, F. Almeida, T. Matos and J. Reis.
+  Modeling wine preferences by data mining from physicochemical properties.
+  In Decision Support Systems, Elsevier, 47(4):547-553. ISSN: 0167-9236.
+
+  Available at: [@Elsevier] http://dx.doi.org/10.1016/j.dss.2009.05.016
+                [Pre-press (pdf)] http://www3.dsi.uminho.pt/pcortez/winequality09.pdf
+                [bib] http://www3.dsi.uminho.pt/pcortez/dss09.bib
+
+1. Title: Wine Quality
+
+2. Sources
+   Created by: Paulo Cortez (Univ. Minho), Antonio Cerdeira, Fernando Almeida, Telmo Matos and Jose Reis (CVRVV) @ 2009
+
+3. Past Usage:
+
+  P. Cortez, A. Cerdeira, F. Almeida, T. Matos and J. Reis.
+  Modeling wine preferences by data mining from physicochemical properties.
+  In Decision Support Systems, Elsevier, 47(4):547-553. ISSN: 0167-9236.
+
+  In the above reference, two datasets were created, using red and white wine samples.
+  The inputs include objective tests (e.g. pH values) and the output is based on sensory data
+  (median of at least 3 evaluations made by wine experts). Each expert graded the wine quality
+  between 0 (very bad) and 10 (very excellent). Several data mining methods were applied to model
+  these datasets under a regression approach. The support vector machine model achieved the
+  best results. Several metrics were computed: MAD, confusion matrix for a fixed error tolerance (T),
+  etc. Also, we plot the relative importances of the input variables (as measured by a sensitivity
+  analysis procedure).
+
+4. Relevant Information:
+
+   The two datasets are related to red and white variants of the Portuguese "Vinho Verde" wine.
+   For more details, consult: http://www.vinhoverde.pt/en/ or the reference [Cortez et al., 2009].
+   Due to privacy and logistic issues, only physicochemical (inputs) and sensory (the output) variables
+   are available (e.g. there is no data about grape types, wine brand, wine selling price, etc.).
+
+   These datasets can be viewed as classification or regression tasks.
+   The classes are ordered and not balanced (e.g. there are many more normal wines than
+   excellent or poor ones). Outlier detection algorithms could be used to detect the few excellent
+   or poor wines. Also, we are not sure if all input variables are relevant. So
+   it could be interesting to test feature selection methods.
+
+5. Number of Instances: red wine - 1599; white wine - 4898.
+
+6. Number of Attributes: 11 + output attribute
+
+   Note: several of the attributes may be correlated, thus it makes sense to apply some sort of
+   feature selection.
+
+7. Attribute information:
+
+   For more information, read [Cortez et al., 2009].
+ + Input variables (based on physicochemical tests): + 1 - fixed acidity + 2 - volatile acidity + 3 - citric acid + 4 - residual sugar + 5 - chlorides + 6 - free sulfur dioxide + 7 - total sulfur dioxide + 8 - density + 9 - pH + 10 - sulphates + 11 - alcohol + Output variable (based on sensory data): + 12 - quality (score between 0 and 10) + +8. Missing Attribute Values: None diff --git a/one_md_per_day_format/piscine/Week1/data/D01/ex6/model_forecasts.txt b/one_md_per_day_format/piscine/Week1/data/D01/ex9/model_forecasts.txt similarity index 100% rename from one_md_per_day_format/piscine/Week1/data/D01/ex6/model_forecasts.txt rename to one_md_per_day_format/piscine/Week1/data/D01/ex9/model_forecasts.txt diff --git a/one_md_per_day_format/piscine/Week1/day1.md b/one_md_per_day_format/piscine/Week1/day1.md index 872ff54..b222164 100644 --- a/one_md_per_day_format/piscine/Week1/day1.md +++ b/one_md_per_day_format/piscine/Week1/day1.md @@ -327,7 +327,7 @@ This question is validated if, without having used a for loop or having filled t The goal of this exercise is to learn to perform a basic data analysis on real data using NumPy. -The data set that will be used for this exercise is the wine data set. +The data set that will be used for this exercise is the red wine data set. https://archive.ics.uci.edu/ml/datasets/wine+quality @@ -341,7 +341,7 @@ How to tell if a given 2D array has null columns? 4. What is the average % of alcohol on all wines in the data set ? If needed, drop `np.nan` values -5. Compute the minimum, the maximum, the 25th percentile, the 50th percentile, the 75 percentile, the median of the pH +5. Compute the minimum, the maximum, the 25th percentile, the 50th percentile, the 75th percentile, the median (50th percentile) of the pH 6. Compute the average quality of the wines having the 20% least sulphates @@ -383,7 +383,7 @@ This slicing gives the answer `my_data[[1,6,11],:]`. > *Note: Using `percentile` or `median` may give different results depending on the duplicate values in the column. If you do not have my results please use `percentile`.* -6. This question is validated if the answer is `5.222222222222222`. The first step is to get the percentile 20% of the column `sulphates`, then create a boolean array that contains `True` of the value is smaller than the percentile 20%, then select this rows with the column quality and compute the `mean`. +6. This question is validated if the answer is ~`5.2`. The first step is to get the percentile 20% of the column `sulphates`, then create a boolean array that contains `True` of the value is smaller than the percentile 20%, then select this rows with the column quality and compute the `mean`. 7. 
This question is validated if the output for the best wines is: @@ -429,6 +429,8 @@ The expected output is: **Usage of for loop is not allowed, you may need to use the library** `itertools` **to create permutations** +https://docs.python.org/3.9/library/itertools.html + ## Correction This exercise is validated if the output is: From e6827de0d7d874e802fc554a9e5bd428166902a7 Mon Sep 17 00:00:00 2001 From: "b.ghazlane" Date: Thu, 8 Apr 2021 00:42:54 +0200 Subject: [PATCH 4/4] fix: exercice to exercise --- one_md_per_day_format/piscine/Week1/day1.md | 18 ++++----- one_md_per_day_format/piscine/Week1/day2.md | 24 +++++------ one_md_per_day_format/piscine/Week1/day3.md | 24 +++++------ one_md_per_day_format/piscine/Week1/day4.md | 26 ++++++------ one_md_per_day_format/piscine/Week1/day5.md | 20 +++++----- one_md_per_day_format/piscine/Week2/day03.md | 32 +++++++-------- one_md_per_day_format/piscine/Week2/day05.md | 22 +++++----- one_md_per_day_format/piscine/Week2/day1.md | 22 +++++----- one_md_per_day_format/piscine/Week2/day2.md | 28 ++++++------- one_md_per_day_format/piscine/Week2/day4.md | 28 ++++++------- .../piscine/Week2/template.md | 10 ++--- .../piscine/Week3/template.md | 10 ++--- .../piscine/Week3/w3day02.md | 18 ++++----- .../piscine/Week3/w3day03.md | 22 +++++----- .../piscine/Week3/w3day04.md | 40 +++++++++---------- .../piscine/Week3/w3day05.md | 24 +++++------ one_md_per_day_format/piscine/Week3/w3day1.md | 20 +++++----- 17 files changed, 194 insertions(+), 194 deletions(-) diff --git a/one_md_per_day_format/piscine/Week1/day1.md b/one_md_per_day_format/piscine/Week1/day1.md index b222164..d54535a 100644 --- a/one_md_per_day_format/piscine/Week1/day1.md +++ b/one_md_per_day_format/piscine/Week1/day1.md @@ -29,7 +29,7 @@ Save one notebook per day or one per exercise. Use markdown to divide your noteb - https://numpy.org/doc/ - https://jakevdp.github.io/PythonDataScienceHandbook/ -# Exercice 1 Your first NumPy array +# Exercise 1 Your first NumPy array The goal of this exercise is to use many Python data types in **NumPy** arrays. **NumPy** arrays are intensively used in **NumPy** and **Pandas**. They are flexible and allow to use optimized **NumPy** underlying functions. @@ -71,7 +71,7 @@ for i in your_np_array: --- -# Exercice 2 Zeros +# Exercise 2 Zeros The goal of this exercise is to learn to create a NumPy array with 0s. @@ -86,7 +86,7 @@ The goal of this exercise is to learn to create a NumPy array with 0s. --- -# Exercice 3 Slicing +# Exercise 3 Slicing The goal of this exercise is to learn NumPy indexing/slicing. It allows to access values of the NumPy array efficiently and without a for loop. @@ -113,7 +113,7 @@ integers[mask] = 0 --- -# Exercice 4 Random +# Exercise 4 Random The goal of this exercise is to learn to generate random data. In Data Science it is extremely useful to generate random data for many reasons: @@ -174,7 +174,7 @@ For this exercise, as the results may change depending on the version of the pac --- -# Exercice 5: Split, concatenate, reshape arrays +# Exercise 5: Split, concatenate, reshape arrays The goal of this exercise is to learn to concatenate and reshape arrays. @@ -214,7 +214,7 @@ https://jakevdp.github.io/PythonDataScienceHandbook/ (section: The Basics of Num --- -# Exercice 6: Broadcasting and Slicing +# Exercise 6: Broadcasting and Slicing The goal of this exercise is to learn to access values of n-dimensional arrays efficiently. 
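Before the questions, here is a minimal, self-contained sketch of what broadcasting and slicing buy you compared to a for loop. The arrays below are made up purely to illustrate the mechanics; they are not the exercise's data.

```python
import numpy as np

# (4,1) + (5,) broadcasts to a (4,5) grid without any loop
grid = np.arange(4).reshape(-1, 1) + np.arange(5)
print(grid.shape)        # (4, 5)

# Slicing reads or writes whole sub-arrays at once
print(grid[1:3, ::2])    # rows 1 and 2, every second column
grid[:, -1] = 0          # set the last column of every row to 0
```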
@@ -266,7 +266,7 @@ https://jakevdp.github.io/PythonDataScienceHandbook/ (section: Computation on Ar --- -# Exercice 7: NaN +# Exercise 7: NaN The goal of this exercise is to learn to deal with missing data in NumPy and to manipulate NumPy arrays. @@ -323,7 +323,7 @@ This question is validated if, without having used a for loop or having filled t --- -# Exercice 8: Wine +# Exercise 8: Wine The goal of this exercise is to learn to perform a basic data analysis on real data using NumPy. @@ -404,7 +404,7 @@ This can be done in three steps: Get the max, create a boolean mask that indicat --- -## Exercice 9 Football tournament +## Exercise 9 Football tournament The goal of this exercise is to learn to use permutations, complex diff --git a/one_md_per_day_format/piscine/Week1/day2.md b/one_md_per_day_format/piscine/Week1/day2.md index 429d848..12dc8fb 100644 --- a/one_md_per_day_format/piscine/Week1/day2.md +++ b/one_md_per_day_format/piscine/Week1/day2.md @@ -17,7 +17,7 @@ Not only is the Pandas library a central component of the data science toolkit b Pandas is built on top of the NumPy package, meaning a lot of the structure of NumPy is used or replicated in Pandas. Data in pandas is often used to feed statistical analysis in SciPy, plotting functions from Matplotlib, and machine learning algorithms in Scikit-learn. -Most of the topics we will cover today are explained and describes with examples in the first ressource. The number of exercices is low on purpose: Take the time to understand the chapter 5 of the ressource, even if there are 40 pages. +Most of the topics we will cover today are explained and describes with examples in the first ressource. The number of exercises is low on purpose: Take the time to understand the chapter 5 of the ressource, even if there are 40 pages. The version of Pandas I used is '1.0.1'. @@ -41,9 +41,9 @@ https://pandas.pydata.org/Pandas_Cheat_Sheet.pdf https://www.learndatasci.com/tutorials/python-pandas-tutorial-complete-introduction-for-beginners/ https://jakevdp.github.io/PythonDataScienceHandbook/03.04-missing-values.html -# Exercice 1 +# Exercise 1 -The goal of this exercice is to learn to create basic Pandas objects. +The goal of this exercise is to learn to create basic Pandas objects. 1. Create a DataFrame as below this using two ways: - From a NumPy array @@ -82,9 +82,9 @@ and if the types of the first value of the columns are ``` -# Exercice 2 **Electric power consumption** +# Exercise 2 **Electric power consumption** -The goal of this exercice is to learn to manipulate real data with Pandas. +The goal of this exercise is to learn to manipulate real data with Pandas. The data set used is **Individual household electric power consumption** 1. Delete the columns `Time`, `Sub_metering_2` and `Sub_metering_3` @@ -118,7 +118,7 @@ The data set used is **Individual household electric power consumption** ## Correction: -1. `del` works but it is not a solution I recommand. For this exercice it is accepted. It is expected to use `drop` with `axis=1`. `inplace=True` may be useful to avoid to affect the result to a variable. +1. `del` works but it is not a solution I recommand. For this exercise it is accepted. It is expected to use `drop` with `axis=1`. `inplace=True` may be useful to avoid to affect the result to a variable. 2. The prefered solution is `set_index` with `inplace=True`. As long as the DataFrame returns the output below, the solution is accepted. If the type of the index is not `dtype='datetime64[ns]'` the solution is not accepted. 
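As an illustration of that correction point, here is a small sketch of getting a `datetime64[ns]` index with `set_index` and `inplace=True`. The two-row frame is invented; it only stands in for the real power consumption file.

```python
import pandas as pd

# Toy stand-in for the household power consumption data
df = pd.DataFrame({'Date': ['2006-12-16', '2006-12-17'],
                   'Global_active_power': [4.216, 5.360]})

df['Date'] = pd.to_datetime(df['Date'])  # parse the strings into timestamps first
df.set_index('Date', inplace=True)       # no re-assignment needed with inplace=True

print(df.index.dtype)                    # datetime64[ns], the accepted index type
```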
@@ -219,9 +219,9 @@ The data set used is **Individual household electric power consumption**



-# Exercice 3: E-commerce purchases
+# Exercise 3: E-commerce purchases

-The goal of this exercice is to learn to manipulate real data with Pandas. This exercice is less guided since the exercice 2 should have given you a nice introduction.
+The goal of this exercise is to learn to manipulate real data with Pandas. This exercise is less guided since exercise 2 should have given you a nice introduction.

The data set used is **E-commerce purchases**.

@@ -240,7 +240,7 @@ Questions:
12. What are the top 5 most popular email providers/hosts (e.g. gmail.com, yahoo.com, etc...)

## Correction
-The validate this exercice all answers should return the expected numerical value given in the correction AND uses Pandas. For example using NumPy to compute the mean doesn't respect the philosophy of the exercice which is to use Pandas.
+To validate this exercise, all answers should return the expected numerical value given in the correction AND use Pandas. For example, using NumPy to compute the mean doesn't respect the philosophy of the exercise, which is to use Pandas.

1. How many rows and columns are there?**10000 entries**

@@ -303,9 +303,9 @@ The validate this exercice all answers should return the expected numerical valu

The prefered solution is based on the usage of `apply` on a `lambda` function that slices the string that contains the email. The `lambda` function uses `split` to split the string on `@`. Finally, `value_counts` is used to count the occurences.

-# Exercice 3 Handling missing values
+# Exercise 3 Handling missing values

-The goal of this exercice is to learn to handle missing values. In the previsous exercice we used the first techniques: filter out the missing values. We were lucky because the proportion of missing values was low. But in some cases, dropping the missing values is not possible because the filtered data set would be too small.
+The goal of this exercise is to learn to handle missing values. In the previous exercise we used the first technique: filter out the missing values. We were lucky because the proportion of missing values was low. But in some cases, dropping the missing values is not possible because the filtered data set would be too small.

This article explains the different types of missing data and how they should be handled.

https://towardsdatascience.com/data-cleaning-with-python-and-pandas-detecting-missing-values-3e9c6ebcf78b
"

@@ -327,7 +327,7 @@ This article explains the different types of missing data and how they should be

## Correction

-To validate the exercice, you should have done these two steps in that order:
+To validate the exercise, you should have done these two steps in that order:

- Convert the numerical columns to `float`

```
diff --git a/one_md_per_day_format/piscine/Week1/day3.md b/one_md_per_day_format/piscine/Week1/day3.md
index 26fa860..ceb3b61 100644
--- a/one_md_per_day_format/piscine/Week1/day3.md
+++ b/one_md_per_day_format/piscine/Week1/day3.md
@@ -33,9 +33,9 @@ https://jakevdp.github.io/PythonDataScienceHandbook/05.13-kernel-density-estimat



-# Exercice 1 Pandas plot 1
+# Exercise 1 Pandas plot 1

-The goal of this exercice is to learn to create plots with use Pandas. Panda's `.plot()` is a wrapper for `matplotlib.pyplot.plot()`.
+The goal of this exercise is to learn to create plots using Pandas. Pandas' `.plot()` is a wrapper for `matplotlib.pyplot.plot()`.
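Before the exercise's own data below, here is a tiny sketch of the wrapper in action. The numbers are invented purely to show the calling pattern, and the fact that `.plot()` hands back a regular Matplotlib object.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Invented counts, just to show that df.plot() forwards to Matplotlib
df = pd.DataFrame({'liked': [10, 25, 40], 'disliked': [5, 15, 10]},
                  index=[2019, 2020, 2021])

ax = df.plot(kind='bar', title='df.plot() is a Matplotlib wrapper')
ax.set_xlabel('year')    # the returned object is a plain Matplotlib Axes
plt.show()
```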
Here is the data we will be using:

@@ -69,9 +69,9 @@ The plot has to contain:

[logo]: images/day03/w1day03_ex1_plot1.png "Bar plot ex1"

-## Exercice 2: Pandas plot 2
+## Exercise 2: Pandas plot 2

-The goal of this exercice is to learn to create plots with use Pandas. Panda's `.plot()` is a wrapper for `matplotlib.pyplot.plot()`.
+The goal of this exercise is to learn to create plots using Pandas. Pandas' `.plot()` is a wrapper for `matplotlib.pyplot.plot()`.

```

@@ -108,7 +108,7 @@ You should also observe that the older people are the bigger the number of children is



-## Exercice 3 Matplotlib 1
+## Exercise 3 Matplotlib 1

The goal of this plot is to learn to use Matplotlib to plot data. As you know, Matplotlib is the underlying library used by Pandas. It provides more options to plot custom visualizations. Howerver, most of the plots we will create with Matplotlib can be reproduced with Pandas' `.plot()`.

@@ -145,7 +145,7 @@ The plot has to contain:

[logo_ex3]: images/day03/w1day03_ex3_plot1.png "Scatter plot ex3"

-# Exercice 4 Matplotlib 2
+# Exercise 4 Matplotlib 2

The goal of this plot is to learn to use Matplotlib to plot different lines in the same plot on different axis using `twinx`. This very useful to compare variables in different ranges. Here is the data:

@@ -187,7 +187,7 @@ The plot has to contain:

https://matplotlib.org/gallery/api/two_scales.html

-# Exercice 5 Matplotlib subplots
+# Exercise 5 Matplotlib subplots

-The goal of this exerice is to learn to use Matplotlib to create subplots.
+The goal of this exercise is to learn to use Matplotlib to create subplots.

1. Reproduce this plot using a **for loop**:

@@ -224,14 +224,14 @@ The plot has to contain:

Check that the plot has been created with a for loop.

-# Exercice 6 Plotly 1
+# Exercise 6 Plotly 1

Plotly has evolved a lot in the previous years. It is important to **always check the documentation**.

Plotly comes with a high level interface: Plotly Express. It helps building some complex plots easily. The lesson won't detail the complex examples. Plotly express is quite interesting while using Pandas Dataframes because there are some built-in functions that leverage Pandas Dataframes.

The plot outputed by Plotly is interactive and can also be dynamic.

-The goal of the exercice is to plot the price of a company. Its price is generated below.
+The goal of the exercise is to plot the price of a company. Its price is generated below.

```
returns = np.random.randn(50)
@@ -284,9 +284,9 @@ The plot has to contain:

[logo_ex6]: images/day03/w1day03_ex6_plot1.png "Time series ex6"

-# Exercice 7 Plotly Box plots
+# Exercise 7 Plotly Box plots

-The goal of this exercice is to learn to use Plotly to plot Box Plots. It is t is a method for graphically depicting groups of numerical data through their quartiles and values as min, max. It allows to compare quickly some variables.
+The goal of this exercise is to learn to use Plotly to plot Box Plots. It is a method for graphically depicting groups of numerical data through their quartiles and values such as the min and max. It allows to quickly compare some variables.

Let us generate 3 random arrays from a normal distribution. And for each array add respectively 1, 2 to the normal distribution.

```
y0 = np.random.randn(50)
y1 = np.random.randn(50) + 1 # shift mean
y2 = np.random.randn(50) + 2
```

-1. Plot in the same Figure 2 box plots as shown in the image. In this exercice the style is not important.
+1. Plot 2 box plots in the same figure, as shown in the image. In this exercise the style is not important.
![alt text][logo_ex7] diff --git a/one_md_per_day_format/piscine/Week1/day4.md b/one_md_per_day_format/piscine/Week1/day4.md index c80414d..2ff98d9 100644 --- a/one_md_per_day_format/piscine/Week1/day4.md +++ b/one_md_per_day_format/piscine/Week1/day4.md @@ -25,9 +25,9 @@ https://towardsdatascience.com/different-ways-to-iterate-over-rows-in-a-pandas-d -# Exercice 1 Concatenate +# Exercise 1 Concatenate -The goal of this exercice is to learn to concatenate DataFrames. The logic is the same for the Series. +The goal of this exercise is to learn to concatenate DataFrames. The logic is the same for the Series. Here are the two DataFrames to concatenate: @@ -55,9 +55,9 @@ df2 = pd.DataFrame([['c', 1], ['d', 2]], | 3 | d | 2 | -# Exercice 2 Merge +# Exercise 2 Merge -The goal of this exercice is to learn to merge DataFrames +The goal of this exercise is to learn to merge DataFrames The logic of merging DataFrames in Pandas is quite similar as the one used in SQL. Here are the two DataFrames to merge: @@ -125,9 +125,9 @@ df2 = pd.DataFrame(df2_dict, columns = ['id', 'Feature1', 'Feature2']) Note: Check that the suffixes are set using the suffix parameters rather than manually changing the columns' name. -## Exercice 3 Merge MultiIndex +## Exercise 3 Merge MultiIndex -The goal of this exercice is to learn to merge DataFrames with MultiIndex. +The goal of this exercise is to learn to merge DataFrames with MultiIndex. Use the code below to generate the DataFrames. `market_data` contains fake market data. In finance, the market is available during the trading days (business days). `alternative_data` contains fake alternative data from social media. This data is available every day. But, for some reasons the Data Engineer lost the last 15 days of alternative data. 1. Using `market_data` as the reference, merge `alternative_data` on `market_data` @@ -182,9 +182,9 @@ One of the answers that returns the correct DataFrame is: 2. This question is validated if the number of missing in the DataFrame is equal to 0 and if `filled_df.sum().sum() == merged_df.sum().sum()` gives: `True` -# Exercice 4 Groupby Apply +# Exercise 4 Groupby Apply -The goal of this exercice is to learn to group the data and apply a function on the groups. +The goal of this exercise is to learn to group the data and apply a function on the groups. The use case we will work on is computing 1. Create a function that uses `pandas.DataFrame.clip` and that replace extreme values by a given percentile. The values that are greater than the upper percentile 80% are replaced by the percentile 80%. The values that are smaller than the lower percentile 20% are replaced by the percentile 20%. This process that correct outliers is called **winsorizing**. @@ -251,7 +251,7 @@ I recommend to use NumPy to compute the percentiles to make sure we used the sam ## Correction -The for loop is forbidden in this exercice. The goal is to use `groupby` and `apply`. +The for loop is forbidden in this exercise. The goal is to use `groupby` and `apply`. 1. This question is validated if the output is: @@ -315,9 +315,9 @@ https://towardsdatascience.com/how-to-use-the-split-apply-combine-strategy-in-pa -# Exercice 5 Groupby Agg +# Exercise 5 Groupby Agg -The goal of this exercice is to learn to compute different type of agregations on the groups. This small DataFrame contains products and prices. +The goal of this exercise is to learn to compute different type of agregations on the groups. This small DataFrame contains products and prices. 
| | value | product | |---:|--------:|:-------------| @@ -353,9 +353,9 @@ Note: The columns don't have to be MultiIndex My answer is: `df.groupby('product').agg({'value':['min','max','mean']})` -# Exercice 6 Unstack +# Exercise 6 Unstack -The goal of this exercice is to learn to unstack a MultiIndex. +The goal of this exercise is to learn to unstack a MultiIndex. Let's assume we trained a machine learning model that predicts a daily score on the companies (tickers) below. It may be very useful to unstack the MultiIndex: plot the time series, vectorize the backtest etc ... ``` diff --git a/one_md_per_day_format/piscine/Week1/day5.md b/one_md_per_day_format/piscine/Week1/day5.md index b806bf3..dc48437 100644 --- a/one_md_per_day_format/piscine/Week1/day5.md +++ b/one_md_per_day_format/piscine/Week1/day5.md @@ -31,9 +31,9 @@ https://pandas.pydata.org/Pandas_Cheat_Sheet.pdf https://www.learndatasci.com/tutorials/python-pandas-tutorial-complete-introduction-for-beginners/ -# Exercice 1 +# Exercise 1 -The goal of this exercice is to learn to manipulate time series in Pandas. +The goal of this exercise is to learn to manipulate time series in Pandas. 1. Create a `Series` named `integer_series`from 1st January 2010 to 31 December 2020. At each date is associated the number of days since 1st January 2010. It starts with 0. @@ -79,9 +79,9 @@ The goal of this exercice is to learn to manipulate time series in Pandas. ``` If the `NaN` values have been dropped the solution is also accepted. The solution uses `rolling().mean()`. -# Exercice 2 +# Exercise 2 -The goal of this exercice is to learn to use Pandas on Time Series an on Financial data. +The goal of this exercise is to learn to use Pandas on Time Series an on Financial data. The data we will use is Apple stock. @@ -144,11 +144,11 @@ To get this result there are two ways: `resample` and `groupby`. There are two k Name: Open, Length: 10118, dtype: float64 ``` - The first way is to compute the return without for loop is to use `pct_change` - - The second way to compute the return without for loop is to implement the formula given in the exercice in a vectorized way. To get the value at `t-1` you can use `shift` + - The second way to compute the return without for loop is to implement the formula given in the exercise in a vectorized way. To get the value at `t-1` you can use `shift` -# Exercice 3 Multi asset returns +# Exercise 3 Multi asset returns -The goal of this exercice is to learn to compute daily returns on a DataFrame that contains many assets (multi-assets). +The goal of this exercise is to learn to compute daily returns on a DataFrame that contains many assets (multi-assets). ``` business_dates = pd.bdate_range('2021-01-01', '2021-12-31') @@ -187,9 +187,9 @@ Note: The data is generated randomly, the values you may have a different result The DataFrame contains random data. Make sure your output and the one returned by this code is based on the same DataFrame. -# Exercice 4 Backtest +# Exercise 4 Backtest -The goal of this exercice is to learn to perform a backtest in Pandas. A backtest is a tool that allows you to know how a strategy would have performed retrospectively using historical data. In this exercice we will focus on the backtesting tool and not on how to build the best strategy. +The goal of this exercise is to learn to perform a backtest in Pandas. A backtest is a tool that allows you to know how a strategy would have performed retrospectively using historical data. 
In this exercise we will focus on the backtesting tool and not on how to build the best strategy. We will backtest a **long only** strategy on Apple Inc. Long only means that we only consider buying the stock. The input signal at date d says if the close price will increase at d+1. We assume that the input signal is available before the market closes. @@ -266,7 +266,7 @@ My results can be reproduced using: `np.random.seed = 2712`. Given the versions Name: Daily_futur_returns, Length: 10118, dtype: float64 ``` - The answer is also accepted if the returns is computed as in the exercice 2 and then shifted in the futur using `shift`, but I do not recommend this implementation as it adds missing values ! + The answer is also accepted if the returns is computed as in the exercise 2 and then shifted in the futur using `shift`, but I do not recommend this implementation as it adds missing values ! An example of solution is: diff --git a/one_md_per_day_format/piscine/Week2/day03.md b/one_md_per_day_format/piscine/Week2/day03.md index dea5725..bacfe11 100644 --- a/one_md_per_day_format/piscine/Week2/day03.md +++ b/one_md_per_day_format/piscine/Week2/day03.md @@ -35,9 +35,9 @@ This object takes as input the preprocessing transforms and a Machine Learning m ## Ressources TODO -# Exercice 1 Imputer 1 +# Exercise 1 Imputer 1 -The goal of this exercice is to learn how to use an Imputer to fill missing values on basic example. +The goal of this exercise is to learn how to use an Imputer to fill missing values on basic example. ``` train_data = [[7, 6, 5], @@ -84,11 +84,11 @@ test_data = [[np.nan, 1, 2], [ 4., 2., 4.]]) ``` -# Exercice 2 Scaler +# Exercise 2 Scaler -The goal of this exercice is to learn to scale a data set. There are various scaling techniques, we will focus on `StandardScaler` from scikit learn. +The goal of this exercise is to learn to scale a data set. There are various scaling techniques, we will focus on `StandardScaler` from scikit learn. -We will use a tiny data set for this exercice that we will generate by ourselves: +We will use a tiny data set for this exercise that we will generate by ourselves: ``` X_train = np.array([[ 1., -1., 2.], @@ -140,8 +140,8 @@ array([[ 1.22474487, -1.22474487, 0.53452248], [ 0. , 1.22474487, 0.53452248]]) ``` -# Exercice 3 One hot Encoder -The goal of this exercice is to learn how to deal with Categorical variables using the OneHot Encoder. +# Exercise 3 One hot Encoder +The goal of this exercise is to learn how to deal with Categorical variables using the OneHot Encoder. ``` X_train = [['Python'], ['Java'], ['Java'], ['C++']] @@ -199,8 +199,8 @@ https://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.OneHotEn -# Exercice 4 Ordinal Encoder -The goal of this exercice is to learn how to deal with Categorical variables using the Ordinal Encoder. +# Exercise 4 Ordinal Encoder +The goal of this exercise is to learn how to deal with Categorical variables using the Ordinal Encoder. In that case, we want the model to consider that: **good > neutral > bad** @@ -242,9 +242,9 @@ array([[2.], -# Exercice 5 Categorical variables +# Exercise 5 Categorical variables -The goal of this exercice is to learn how to deal with Categorical variables with Ordinal Encoder, Label Encoder and OneHot Encoder. +The goal of this exercise is to learn how to deal with Categorical variables with Ordinal Encoder, Label Encoder and OneHot Encoder. 
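Before the breast-cancer preliminary steps below, here is a toy sketch of the three encoders side by side. The categories are invented, and the keyword defaults are those of the scikit-learn versions in use when this course was written.

```python
import numpy as np
from sklearn.preprocessing import LabelEncoder, OneHotEncoder, OrdinalEncoder

# Invented feature column and target, not the breast-cancer data
X = np.array([['bad'], ['neutral'], ['good'], ['neutral']])
y = np.array(['recurrence', 'no-recurrence', 'no-recurrence', 'recurrence'])

# OrdinalEncoder keeps one column and respects an explicit order: bad < neutral < good
ordinal = OrdinalEncoder(categories=[['bad', 'neutral', 'good']])
print(ordinal.fit_transform(X).ravel())   # [0. 1. 2. 1.]

# OneHotEncoder creates one binary column per category
# (the keyword is named sparse_output in recent scikit-learn releases)
onehot = OneHotEncoder(sparse=False)
print(onehot.fit_transform(X))

# LabelEncoder is intended for the target y, not for feature columns
print(LabelEncoder().fit_transform(y))    # [1 0 0 1]
```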
Preliminary: - Load the breast-cancer.csv file @@ -359,7 +359,7 @@ AttributeError: Transformer ordinalencoder (type OrdinalEncoder) does not provid ``` -**It means that if you want to use the Ordinal Encoder, you will have to create a variable that contains the columns name in the right order. This step is not required in that exercice** +**It means that if you want to use the Ordinal Encoder, you will have to create a variable that contains the columns name in the right order. This step is not required in that exercise** @@ -438,9 +438,9 @@ array([[1., 0., 1., 0., 0., 1., 0., 0., 0., 1., 0., 1., 0., 2., 2., 0., ``` -# Exercice 6 Pipeline +# Exercise 6 Pipeline -The goal of this exercice is to learn to use the Scikit-learn object: Pipeline. The data set: used for this exercice is the `iris` data set. +The goal of this exercise is to learn to use the Scikit-learn object: Pipeline. The data set: used for this exercise is the `iris` data set. Preliminary: - Run the code below. @@ -513,9 +513,9 @@ On financial data set, the ratio signal to noise is low. Trying to forecast stoc -# Exercice 1 Imputer 2 +# Exercise 1 Imputer 2 -The goal of this exercice is to learn how to use an Imputer to fill missing values in the data set. +The goal of this exercise is to learn how to use an Imputer to fill missing values in the data set. **Reminder**: The data exploration should be done first. It tells which rows/variables should be removed because there are too many missing values. Then the remaining data points can be treated using an Imputer. diff --git a/one_md_per_day_format/piscine/Week2/day05.md b/one_md_per_day_format/piscine/Week2/day05.md index bd6a2ef..a71ebb9 100644 --- a/one_md_per_day_format/piscine/Week2/day05.md +++ b/one_md_per_day_format/piscine/Week2/day05.md @@ -6,12 +6,12 @@ # Introduction -If you finished yesterday's exercices you should be able to train several Machine Learning algorithms and to choose one returned by GridSearchCV. +If you finished yesterday's exercises you should be able to train several Machine Learning algorithms and to choose one returned by GridSearchCV. GridSearchCV returns the model that gives the best score on the test set. Yesterday, as I told you, I changed the **cv** parameter to compute the GridSearch with a train set and a test set. It means that the selected model is based on one single measure. What if, by luck, we predict correctly on that section ? What if the best model is bad ? What if I could have selected a better model ? We will answer these questions today ! The topics we will cover are the one of the most important in Machine Learning. -Must read before to start the exercices: +Must read before to start the exercises: - Biais-Variance trade off; aka Underfitting/Overfitting. - https://machinelearningmastery.com/gentle-introduction-to-the-bias-variance-trade-off-in-machine-learning/ @@ -28,9 +28,9 @@ Must read before to start the exercices: ## Ressources -# Exercice 1: K-Fold +# Exercise 1: K-Fold -The goal of this exercice is to learn to use `KFold` to split the data set in a k-fold cross validation. Most of the time you won't use this function to split your data because this function is used by others as `cross_val_score` or `cross_validate` or `GridSearchCV` ... . But, this allows to understand the splitting and to create a custom one if needed. +The goal of this exercise is to learn to use `KFold` to split the data set in a k-fold cross validation. 
Most of the time you won't use this function to split your data because it is used internally by other functions such as `cross_val_score`, `cross_validate` or `GridSearchCV`. But, this allows to understand the splitting and to create a custom one if needed.

```
X = np.array(np.arange(1,21).reshape(10,-1))
y = np.array(np.arange(1,11))
```

@@ -81,9 +81,9 @@ y = np.array(np.arange(1,11))



-# Exercice 2: Cross validation (k-fold)
+# Exercise 2: Cross validation (k-fold)

-The goal of this exercice is to learn how to use cross validation. After reading the articles you should be able to explain why we need to cross-validate the models. We will firstly focus on Linear Regression to reduce the computation time. We will be using `cross_validate` to run the cross validation. Note that `cross_val_score` is similar bu the `cross_validate` calculates one or more scores and timings for each CV split.
+The goal of this exercise is to learn how to use cross validation. After reading the articles you should be able to explain why we need to cross-validate the models. We will firstly focus on Linear Regression to reduce the computation time. We will be using `cross_validate` to run the cross validation. Note that `cross_val_score` is similar, but `cross_validate` calculates one or more scores and timings for each CV split.

Preliminary:

@@ -159,9 +159,9 @@ The model is consistent across folds: it is stable. That's a first sign that the



-# Exercice 3 GridsearchCV
+# Exercise 3 GridSearchCV

-The goal of this exercice is to learn to use GridSearchCV to run a grid search, predict on the test set and score on the test set.
+The goal of this exercise is to learn to use GridSearchCV to run a grid search, predict on the test set and score on the test set.

Preliminary:

@@ -250,13 +250,13 @@ WARNING: If the score used in classification is the AUC, there is one rare case



-# Exercice 5 Validation curve and Learning curve
+# Exercise 5 Validation curve and Learning curve

-The goal of this exercice is to learn to analyse the models' performance with two tools:
+The goal of this exercise is to learn to analyse the models' performance with two tools:

- Validation curve
- Learning curve

-For this exercice we will use a dataset of 100k data points to give you an idea of the computation time you can expect during projects.
+For this exercise we will use a dataset of 100k data points to give you an idea of the computation time you can expect during projects.

Preliminary:

diff --git a/one_md_per_day_format/piscine/Week2/day1.md b/one_md_per_day_format/piscine/Week2/day1.md
index ade889d..62f7fb2 100644
--- a/one_md_per_day_format/piscine/Week2/day1.md
+++ b/one_md_per_day_format/piscine/Week2/day1.md
@@ -51,9 +51,9 @@ https://scikit-learn.org/stable/tutorial/index.html

- https://developers.google.com/machine-learning/crash-course/training-and-test-sets/video-lecture?hl=en

-# Exercice 1 Scikit-learn estimator
+# Exercise 1 Scikit-learn estimator

-The goal of this exercice is to learn to fit a Scikit-learn estimator and use it to predict.
+The goal of this exercise is to learn to fit a Scikit-learn estimator and use it to predict.


```
X, y = [[1],[2.1],[3]], [[1],[2],[3]]
```

-# Exercice 2 Linear regression in 1D
+# Exercise 2 Linear regression in 1D

-The goal of this exercice is to understand how the linear regression works in one dimension. To do so, we will generate a data in one dimension. Using `make regression` from Scikit-learn, generate a data set with 100 observations:
+The goal of this exercise is to understand how linear regression works in one dimension. To do so, we will generate data in one dimension. Using `make regression` from Scikit-learn, generate a data set with 100 observations:

```
X, y, coef = make_regression(n_samples=100,
@@ -162,9 +162,9 @@ array([ 83.86186727, 140.80961751, 116.3333897 ,  64.52998689,

6. This question is validated if the MSE returned is `2854.2871542048706`

-# Exercice 3: Train test split
+# Exercise 3: Train test split

-The goal of this exercice is to learn to split a data set. It is important to understand why we split the data in two sets. To put it in a nutshell: the Machine Learning algorithms learns on the training data and is evaluated on the that it hasn't seen before: the testing data.
+The goal of this exercise is to learn to split a data set. It is important to understand why we split the data in two sets. To put it in a nutshell: the Machine Learning algorithm learns on the training data and is evaluated on data that it hasn't seen before: the testing data.

This video gives a basic and nice explanation: https://www.youtube.com/watch?v=_vdMKioCXqQ

@@ -208,10 +208,10 @@ y_test: [ 9 10]

```

-# Exercice 4 Forecast diabetes progression
+# Exercise 4 Forecast diabetes progression

-The goal of this exercice is to use Linear Regression to forecast the progression of diabetes. It will not always be precised, you should **ALWAYS** start doing an exploratory data analysis in order to have a good understanding of the data you model. As a reminder here an introduction to EDA:
+The goal of this exercise is to use Linear Regression to forecast the progression of diabetes. It will not always be stated explicitly, but you should **ALWAYS** start doing an exploratory data analysis in order to have a good understanding of the data you model. As a reminder, here is an introduction to EDA:

https://towardsdatascience.com/exploratory-data-analysis-eda-a-practical-guide-and-template-for-structured-data-abfbf3ee3bd9

The data set used is described in https://scikit-learn.org/stable/modules/generated/sklearn.datasets.load_diabetes.

@@ -300,11 +300,11 @@ https://scikit-learn.org/stable/datasets/toy_dataset.html#diabetes-dataset

4. This question is validated if the mse on the **train set** is `2888.326888` and the mse on the **test set** is `2858.255153`.

-## Exercice 5 Gradient Descent
+## Exercise 5 Gradient Descent

-The goal of this exercice is to understand how the Linear Regression algorithm finds the optimal coefficients.
+The goal of this exercise is to understand how the Linear Regression algorithm finds the optimal coefficients.

-The goal is to fit a Linear Regression on a one dimensional features data **without using Scikit-learn**. Let's use the data set we generated for the exercice 1:
+The goal is to fit a Linear Regression on one-dimensional feature data **without using Scikit-learn**. Let's use the data set we generated for exercise 1:


```
diff --git a/one_md_per_day_format/piscine/Week2/day2.md b/one_md_per_day_format/piscine/Week2/day2.md
index ca30a5c..81744c4 100644
--- a/one_md_per_day_format/piscine/Week2/day2.md
+++ b/one_md_per_day_format/piscine/Week2/day2.md
@@ -31,8 +31,8 @@

More details:

https://towardsdatascience.com/understanding-logistic-regression-9b02c2aec102

-For the linear regression exercices, the loss (Mean Square Error - MSE) is minimized with an algorithm called **gradient descent**. In the classification, the loss MSE can't be used because the output of the model is 0 or 1 (for binary classfication).
-The **logloss** or **cross entropy** is the loss used for classification. Similarly, it has some nice mathematical properties. The minimization of the **logloss** is not covered in the exercices. However, since it is used in most machine learning models for classification, I recommand to spend some time reading the related article. This article gives a nice example of how it works:
+For the linear regression exercises, the loss (Mean Square Error - MSE) is minimized with an algorithm called **gradient descent**. In classification, the MSE loss can't be used because the output of the model is 0 or 1 (for binary classification).
+The **logloss** or **cross entropy** is the loss used for classification. Similarly, it has some nice mathematical properties. The minimization of the **logloss** is not covered in the exercises. However, since it is used in most machine learning models for classification, I recommend spending some time reading the related article. This article gives a nice example of how it works:

https://towardsdatascience.com/cross-entropy-for-classification-d98e7f974451

@@ -48,9 +48,9 @@ https://medium.com/swlh/what-is-logistic-regression-62807de62efa



-# Exercice 1 Logistic regression in Scikit-learn
+# Exercise 1 Logistic regression in Scikit-learn

-The goal of this exercice is to learn to use Scikit-learn to classify data.
+The goal of this exercise is to learn to use Scikit-learn to classify data.

```
X = [[0],[0.1],[0.2], [1],[1.1],[1.2], [1.3]]
y = [0,0,0,1,1,1,0]
```

@@ -93,9 +93,9 @@ Score:

```

-# Exercice 2 Sigmoid
+# Exercise 2 Sigmoid

-The goal of this exercice is to learn to compute and plot the sigmoid function.
+The goal of this exercise is to learn to compute and plot the sigmoid function.

1. On the same plot, plot the sigmoid function and the custom sigmoids defined as:

```

@@ -121,9 +121,9 @@ The plot should look like this:



-# Exercice 3 Decision boundary
+# Exercise 3 Decision boundary

-The goal of this exercice is to learn to fit a logistic regression on simple examples and to understand how the algorithm separated the data from the different classes.
+The goal of this exercise is to learn to fit a logistic regression on simple examples and to understand how the algorithm separates the data from the different classes.

## 1 dimension

@@ -304,9 +304,9 @@ As mentioned, it is not required to shift the class prediction to make the plot



-# Exercice 4: Train test split
+# Exercise 4: Train test split

-The goal of this exercice is to learn to split a classification data set. The idea is the same as splitting a regression data set but there's one important detail specific to the classification: the proportion of each class in the train set and test set.
+The goal of this exercise is to learn to split a classification data set. The idea is the same as splitting a regression data set but there's one important detail specific to classification: the proportion of each class in the train set and test set.



@@ -358,9 +358,9 @@ The proportion of class `1` is **0.125** in the train set and **1.** in the test

2. This question is validated if the proportion of class `1` is **0.3** for both sets.

-# Exercice 5 Breast Cancer prediction
+# Exercise 5 Breast Cancer prediction

-The goal of this exercice is to use Logistic Regression
+The goal of this exercise is to use Logistic Regression
It is always important to understand the data before training any Machine Learning algorithm. The data is described in **breast-cancer-wisconsin.names**. I suggest to add manually the column names in the DataFrame. Preliminary: @@ -439,9 +439,9 @@ array([[90, 2], As said, for some reasons, you may have slighty different results because of the data splitting. However, the values you have in the confusion matrix should be close to these results. -# Exercice 6 Multi-class (Optional) +# Exercise 6 Multi-class (Optional) -The goal of this exercice is to learn to train a classfication algorithm on a multi-class labelled data. +The goal of this exercise is to learn to train a classfication algorithm on a multi-class labelled data. Some algorithms as SVM or Logistic Regression do not natively support multi-class (more than 2 classes). There are some approaches that allow to use these algorithms on multi-class data. Let's assume we work with 3 classes: A, B and C. diff --git a/one_md_per_day_format/piscine/Week2/day4.md b/one_md_per_day_format/piscine/Week2/day4.md index 92e7878..67ce3ee 100644 --- a/one_md_per_day_format/piscine/Week2/day4.md +++ b/one_md_per_day_format/piscine/Week2/day4.md @@ -36,9 +36,9 @@ https://www.kdnuggets.com/2018/06/right-metric-evaluating-machine-learning-model https://scikit-learn.org/stable/modules/model_evaluation.html -# Exercice 1 MSE Scikit-learn +# Exercise 1 MSE Scikit-learn -The goal of this exercice is to learn to use `sklearn.metrics` to compute the mean squared error (MSE). +The goal of this exercise is to learn to use `sklearn.metrics` to compute the mean squared error (MSE). 1. Compute the MSE using `sklearn.metrics` on `y_true` and `y_pred` below: @@ -51,10 +51,10 @@ y_pred = [90, 48, 2, 2, -4] 1. This question is validated if the MSE outputted is **2.25**. -# Exercice 2 Accuracy Scikit-learn +# Exercise 2 Accuracy Scikit-learn -The goal of this exercice is to learn to use `sklearn.metrics` to compute the accuracy. +The goal of this exercise is to learn to use `sklearn.metrics` to compute the accuracy. 1. Compute the accuracy using `sklearn.metrics` on `y_true` and `y_pred` below: @@ -68,9 +68,9 @@ y_true = [0, 0, 1, 1, 1, 1, 0] -# Exercice 3 Regression +# Exercise 3 Regression -The goal of this exercice is to learn to evaluate a machine learning model using many regression metrics. +The goal of this exercise is to learn to evaluate a machine learning model using many regression metrics. Preliminary: @@ -138,13 +138,13 @@ pipe.fit(X_train, y_train) MSE on the test set: 0.5537420654727396 ``` - This result shows that the model has slightly better results on the train set than the test set. That's frequent since it is easier to get a better grade on an exam we studied than an exam that is different from what was prepared. However, the results are not good: r2 ~ 0.3. Fitting non linear models as the Random Forest on this data may improve the results. That's the goal of the exercice 5. + This result shows that the model has slightly better results on the train set than the test set. That's frequent since it is easier to get a better grade on an exam we studied than an exam that is different from what was prepared. However, the results are not good: r2 ~ 0.3. Fitting non linear models as the Random Forest on this data may improve the results. That's the goal of the exercise 5. -# Exercice 4 Classification +# Exercise 4 Classification -The goal of this exercice is to learn to evaluate a machine learning model using many classification metrics. 
+The goal of this exercise is to learn to evaluate a machine learning model using many classification metrics. Preliminary: @@ -232,9 +232,9 @@ Having a 99% ROC AUC is not usual. The data set we used is easy to classify. On -# Exercice 5 Machine Learning models +# Exercise 5 Machine Learning models -The goal of this exercice is to have an overview of the existing Machine Learning models and to learn to call them from scikit learn. +The goal of this exercise is to have an overview of the existing Machine Learning models and to learn to call them from scikit learn. We will focus on: - SVM/ SVC @@ -363,9 +363,9 @@ Take time to have basic understanding of the role of the basic hyperparameters a It is important to notice that the Decision Tree overfits very easily. It learns easily the training data but is not able to extrapolate on the test set. This algorithm is not used a lot. However, Random Forest and Gradient Boosting propose a solid approach to correct the overfitting (in that case the parameters `max_depth` is set to None that is why the Random Forest overfits the data). These two algorithms are used intensively in Machine Learning Projets. -# Exercice 6 Grid Search +# Exercise 6 Grid Search -The goal of this exercice is to learn how to make an exhaustive search over specified parameter values for an estimator. This is very useful because the hyperparameters which are the paremeters of the model impact the performance of the model. +The goal of this exercise is to learn how to make an exhaustive search over specified parameter values for an estimator. This is very useful because the hyperparameters which are the paremeters of the model impact the performance of the model. The scikit learn object that runs the Grid Search is called GridSearchCV. We will learn tomorrow about the cross validation. For now, let us set the parameter **cv** to `[(np.arange(18576), np.arange(18576,20640))]`. This means that GridSearchCV splits the data set in a train and test set. @@ -450,7 +450,7 @@ Ressources: return gs.best_estimator_, gs.best_params_, gs.best_score_ ``` - In my case, the gridsearch parameters are not interesting. Even if I reduced the overfitting of the Random Forest, the score on the test is lower than the score on the test returned by the Gradient Boosting in the previous exercice without optimal parameters search. + In my case, the gridsearch parameters are not interesting. Even if I reduced the overfitting of the Random Forest, the score on the test is lower than the score on the test returned by the Gradient Boosting in the previous exercise without optimal parameters search. 3. 
This question is validated if the code used is: diff --git a/one_md_per_day_format/piscine/Week2/template.md b/one_md_per_day_format/piscine/Week2/template.md index 290549a..56ecd88 100644 --- a/one_md_per_day_format/piscine/Week2/template.md +++ b/one_md_per_day_format/piscine/Week2/template.md @@ -16,21 +16,21 @@ ## Ressources -# Exercice 1 +# Exercise 1 -# Exercice 2 +# Exercise 2 -# Exercice 3 +# Exercise 3 -# Exercice 4 +# Exercise 4 -# Exercice 5 +# Exercise 5 diff --git a/one_md_per_day_format/piscine/Week3/template.md b/one_md_per_day_format/piscine/Week3/template.md index 290549a..56ecd88 100644 --- a/one_md_per_day_format/piscine/Week3/template.md +++ b/one_md_per_day_format/piscine/Week3/template.md @@ -16,21 +16,21 @@ ## Ressources -# Exercice 1 +# Exercise 1 -# Exercice 2 +# Exercise 2 -# Exercice 3 +# Exercise 3 -# Exercice 4 +# Exercise 4 -# Exercice 5 +# Exercise 5 diff --git a/one_md_per_day_format/piscine/Week3/w3day02.md b/one_md_per_day_format/piscine/Week3/w3day02.md index 7c0079a..e4145a3 100644 --- a/one_md_per_day_format/piscine/Week3/w3day02.md +++ b/one_md_per_day_format/piscine/Week3/w3day02.md @@ -10,7 +10,7 @@ The goal of this day is to learn to use Keras to build Neural Networks. There are two ways to build Keras models: sequential and functional. -The sequential API allows you to create models layer-by-layer for most problems. It is limited in that it does not allow you to create models that share layers or have multiple inputs or outputs. The exercices focuses on the usage of the sequential API. +The sequential API allows you to create models layer-by-layer for most problems. It is limited in that it does not allow you to create models that share layers or have multiple inputs or outputs. The exercises focuses on the usage of the sequential API. '2.4.3' @@ -25,9 +25,9 @@ A developper ## Ressources https://machinelearningmastery.com/tutorial-first-neural-network-python-keras/ -# Exercice 1 Sequential +# Exercise 1 Sequential -The goal of this exercice is to learn to call the object `Sequential`. +The goal of this exercise is to learn to call the object `Sequential`. 1. Put the object Sequential in a variable named `model` and print the variable `model`. @@ -39,9 +39,9 @@ The goal of this exercice is to learn to call the object `Sequential`. -# Exercice 2 Dense +# Exercise 2 Dense -The goal of this exercice is to learn to create layers of neurons. Keras proposes options to create custom layers. The neural networks build in these exercices do not require custom layers. `Dense` layers do the job. A dense layer is simply a layer where each unit or neuron is connected to each neuron in the next layer. As seen yesterday, there are three main types of layers: input, hidden and output. The **input layer** that specifies the number of inputs (features) is not represented as a layer in Keras. However, `Dense` has a parameter `input_dim` that gives the number of inputs in the previous layer. The output layer as any hidden layer can be created using `Dense`, the only difference is that the output layer contains one single neuron. +The goal of this exercise is to learn to create layers of neurons. Keras proposes options to create custom layers. The neural networks build in these exercises do not require custom layers. `Dense` layers do the job. A dense layer is simply a layer where each unit or neuron is connected to each neuron in the next layer. As seen yesterday, there are three main types of layers: input, hidden and output. 
The **input layer** that specifies the number of inputs (features) is not represented as a layer in Keras. However, `Dense` has a parameter `input_dim` that gives the number of inputs in the previous layer. The output layer as any hidden layer can be created using `Dense`, the only difference is that the output layer contains one single neuron. 1. Create a `Dense` layer with these parameters and return the output of `get_config`: @@ -121,9 +121,9 @@ The goal of this exercice is to learn to create layers of neurons. Keras propose 'bias_constraint': None} ``` -# Exercice 3 Architecture +# Exercise 3 Architecture -The goal of this exercice is to combine the layers and to create a neural network. +The goal of this exercise is to combine the layers and to create a neural network. 1. Create a neural network for regression with the following architecture and return `print(model.summary())`: @@ -145,9 +145,9 @@ The goal of this exercice is to combine the layers and to create a neural networ ``` The first two layers could use another activation function that sigmoid (eg: relu) -# Exercice 4 Optimize +# Exercise 4 Optimize -The goal of this exercice is to learn to train the neural network. Once the architecture of the neural network is set there are two steps to train the neural network: +The goal of this exercise is to learn to train the neural network. Once the architecture of the neural network is set there are two steps to train the neural network: - `compile`: The compilation step aims to set the loss function, to choose the algoithm to minimize the chosen loss function and to choose the metric the model outputs. diff --git a/one_md_per_day_format/piscine/Week3/w3day03.md b/one_md_per_day_format/piscine/Week3/w3day03.md index b6a7eeb..18aeeac 100644 --- a/one_md_per_day_format/piscine/Week3/w3day03.md +++ b/one_md_per_day_format/piscine/Week3/w3day03.md @@ -24,9 +24,9 @@ A developper https://machinelearningmastery.com/tutorial-first-neural-network-python-keras/ -# Exercice 1 Regression - Optimize +# Exercise 1 Regression - Optimize -The goal of this exercice is to learn to set up the optimization for a regression neural network. There's no code to run in that exercice. In W2D2E3, we implemented a neural network designed for regression. We will be using this neural network: +The goal of this exercise is to learn to set up the optimization for a regression neural network. There's no code to run in that exercise. In W2D2E3, we implemented a neural network designed for regression. We will be using this neural network: ``` model = keras.Sequential() @@ -68,9 +68,9 @@ https://keras.io/api/losses/regression_losses/ https://keras.io/api/metrics/regression_metrics/ -# Exercice 2 Regression example +# Exercise 2 Regression example -The goal of this exercice is to learn to train a neural network to perform a regression on a data set. +The goal of this exercise is to learn to train a neural network to perform a regression on a data set. The data set is Auto MPG Dataset and the go is to build a model to predict the fuel efficiency of late-1970s and early 1980s automobiles. To do this, provide the model with a description of many automobiles from that time period. This description includes attributes like: cylinders, displacement, horsepower, and weight. 
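The TensorFlow tutorial linked just below walks through this end to end. As a minimal sketch of the same workflow, under the assumption of random stand-in data and invented shapes rather than the real Auto MPG file, the skeleton looks like this:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras.layers import Dense

# Random stand-in for the scaled Auto MPG features and targets
X_train = np.random.randn(200, 5)   # 5 invented numeric features
y_train = np.random.randn(200)      # continuous target, e.g. fuel efficiency

model = keras.Sequential([
    Dense(8, input_dim=5, activation='relu'),
    Dense(4, activation='relu'),
    Dense(1),                       # single linear neuron for regression
])

# MSE as loss and metric, the usual choice for regression
model.compile(optimizer='adam', loss='mse', metrics=['mse'])
model.fit(X_train, y_train, epochs=5, batch_size=16, verbose=0)
```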
https://www.tensorflow.org/tutorials/keras/regression

@@ -150,9 +150,9 @@ The output neuron has to be `Dense(1)` - by defaut the activation funtion is lin

*Hint*: To get the score on the test set, `evaluate` could have been used: `model.evaluate(X_test_scaled, y_test)`.

-# Exercice 3 Multi classification - Softmax
+# Exercise 3 Multi classification - Softmax

-The goal of this exercice is to learn to a neural network architecture for multi-class data. This is an important type of problem on which to practice with neural networks because the three class values require specialized handling. A multi-classification neural network uses as output layer a **softmax** layer. The **softmax** activation function is an extension of the sigmoid as it is designed to output the probabilities to belong to each class in a multi-class problem. This output layer has to contain as much neurons as classes in the multi-classification problem. This article explains in detail how it works. https://developers.google.com/machine-learning/crash-course/multi-class-neural-networks/softmax
+The goal of this exercise is to learn to design a neural network architecture for multi-class data. This is an important type of problem on which to practice with neural networks because the three class values require specialized handling. A multi-classification neural network uses as output layer a **softmax** layer. The **softmax** activation function is an extension of the sigmoid as it is designed to output the probabilities to belong to each class in a multi-class problem. This output layer has to contain as many neurons as classes in the multi-classification problem. This article explains in detail how it works. https://developers.google.com/machine-learning/crash-course/multi-class-neural-networks/softmax

Let us assume we want to classify images and we know they contain either apples, bears, candies, eggs or dogs (extension of the example in the link above).

@@ -175,9 +175,9 @@ Let us assume we want to classify images and we know they contain either apples,

model.add(Dense(5, activation= 'softmax'))
```

-# Exercice 4 Multi classification - Optimize
+# Exercise 4 Multi classification - Optimize

-The goal of this exercice is to learn to optimize a multi-classification neural network. As learnt previously, the loss function used in binary classification is the log loss - also called in Keras `binary_crossentropy`. This function is defined for binary classification and can be extended to multi-classfication. In Keras, the extended loss that supports multi-classification is `binary_crossentropy`. There's no code to run in that exercice.
+The goal of this exercise is to learn to optimize a multi-classification neural network. As learnt previously, the loss function used in binary classification is the log loss - also called in Keras `binary_crossentropy`. This function is defined for binary classification and can be extended to multi-classification. In Keras, the extended loss that supports multi-classification is `categorical_crossentropy`, as the correction below shows. There's no code to run in this exercise.

1. Fill the chunk of code below in order to optimize the neural network defined in the previous exercise. Choose the adapted loss, adam as optimizer and the accuracy as metric.

model.compile(loss='categorical_crossentropy',
              metrics=['accuracy'])
```

-# Exercice 5 Multi classification example
+# Exercise 5 Multi classification example

-The goal of this exercice is to learn to use a neural network to classify a multiclass data set. The data set used is the Iris data set which allows to classify flower given basic features as flower's measurement.
-# Exercice 5 Multi classification example
+# Exercise 5 Multi classification example

-The goal of this exercice is to learn to use a neural network to classify a multiclass data set. The data set used is the Iris data set which allows to classify flower given basic features as flower's measurement.
+The goal of this exercise is to learn to use a neural network to classify a multi-class data set. The data set used is the Iris data set, which lets us classify flowers given basic features such as their measurements.

Preliminary:

- Split train test. Keep 20% for the test set. Use `random_state=1`.

@@ -245,6 +245,6 @@ model.fit(X_train_sc, y_train_multi_class, epochs = 1000, batch_size=20)

-# Exercice 6 GridSearch
+# Exercise 6 GridSearch

https://medium.com/@am.benatmane/keras-hyperparameter-tuning-using-sklearn-pipelines-grid-search-with-cross-validation-ccfc74b0ce9f

diff --git a/one_md_per_day_format/piscine/Week3/w3day04.md b/one_md_per_day_format/piscine/Week3/w3day04.md
index ec51e45..5711207 100644
--- a/one_md_per_day_format/piscine/Week3/w3day04.md
+++ b/one_md_per_day_format/piscine/Week3/w3day04.md
@@ -29,12 +29,12 @@ Les packages NLTK and Spacy to do the preprocessing

## Ressources

-# Exercice 1: Lowercase
+# Exercise 1: Lowercase

-The goal of this exercice is to learn to lowercase text data in Python. Note that if the volume of data is low the text data can be stored in a Pandas DataFrame or Series. But, when dealing with high volumes (high but not huge), using a Pandas DataFrame or Series is not efficient. Data structures as dictionaries or list are more adapted.
+The goal of this exercise is to learn to lowercase text data in Python. Note that if the volume of data is low, the text data can be stored in a Pandas DataFrame or Series. But, when dealing with high volumes (high but not huge), using a Pandas DataFrame or Series is not efficient. Data structures such as dictionaries or lists are more suitable.

```
-list_ = ["This is my first NLP exercice", "wtf!!!!!"]
+list_ = ["This is my first NLP exercise", "wtf!!!!!"]
series_data = pd.Series(list_, name='text')
```

@@ -46,21 +46,21 @@ Note: Do not change the text manually !

1. This question is validated if the output is:

```
- 0 this is my first nlp exercice
+ 0 this is my first nlp exercise
1 wtf!!!!!
Name: text, dtype: object
```

2. This question is validated if the output is:

```
- 0 THIS IS MY FIRST NLP EXERCICE
+ 0 THIS IS MY FIRST NLP EXERCISE
1 WTF!!!!!
Name: text, dtype: object
```

# Exercise 2: Punctuation

-The goal of this exerice is to learn to deal with punctuation. In Natural Language Processing, some basic approaches as Bag of Words (exercice X) model the text as an unordered combination of words. In that case the punctuation is not always useful as it doesn't add information to the model. That is why is removed.
+The goal of this exercise is to learn to deal with punctuation. In Natural Language Processing, some basic approaches such as Bag of Words (exercise X) model the text as an unordered combination of words. In that case the punctuation is not always useful, as it doesn't add information to the model. That is why it is removed.

1. Remove the punctuation from this sentence. All characters in !"#$%&'()*+,-./:;<=>?@[\]^_`{|}~ are considered as punctuation.

@@ -81,9 +81,9 @@ The goal of this exerice is to learn to deal with punctuation. In Natural Langua
```
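One common way to remove punctuation (a sketch, not necessarily the official correction) uses `string.punctuation`, which contains exactly the characters listed above, together with `str.translate`:

```python
import string

# Map every punctuation character to None and apply the mapping.
sentence = "Remove, this: punctuation!!!"  # illustrative input
table = str.maketrans('', '', string.punctuation)
print(sentence.translate(table))  # Remove this punctuation
```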
-# Exercice 3 Tokenization
+# Exercise 3 Tokenization

-The goal of this exercice is to learn to tokenize as text. This step is important because it splits the text into token. A token could be a sentence or a word.
+The goal of this exercise is to learn to tokenize a text. This step is important because it splits the text into tokens. A token can be a sentence or a word.

```
text = """Bitcoin is a cryptocurrency invented in 2008 by an unknown person or group of people using the name Satoshi Nakamoto. The currency began use in 2009 when its implementation was released as open-source software."""
```

@@ -152,13 +152,13 @@ https://www.analyticsvidhya.com/blog/2019/07/how-get-started-nlp-6-unique-ways-p

```

-# Exercice 4 Stop words
+# Exercise 4 Stop words

-The goal of this exercice is to learn to remove stop words with NLTK. Stop words usually refers to the most common words in a language. For example: "and", "is", "a" are stop words and do not add information to a sentence.
+The goal of this exercise is to learn to remove stop words with NLTK. Stop words usually refer to the most common words in a language. For example: "and", "is", "a" are stop words and do not add information to a sentence.

```
text = """
-The goal of this exercice is to learn to remove stop words with NLTK. Stop words usually refers to the most common words in a language.
+The goal of this exercise is to learn to remove stop words with NLTK. Stop words usually refers to the most common words in a language.
"""
```

1. Remove stop words from this sentence and return the list of word tokens without stop words.

## Correction

1. This question is validated if, using NLTK, the output is:

```
- ['The', 'goal', 'exercice', 'learn', 'remove', 'stop', 'words', 'NLTK', '.', 'Stop', 'words', 'usually', 'refers', 'common', 'words', 'language', '.']
+ ['The', 'goal', 'exercise', 'learn', 'remove', 'stop', 'words', 'NLTK', '.', 'Stop', 'words', 'usually', 'refers', 'common', 'words', 'language', '.']
```

-# Exercice 5 Stemming
+# Exercise 5 Stemming

-The goal of this exercice is to learn to use stemming using NLTK.
+The goal of this exercise is to learn to use stemming with NLTK. As explained in detail in the article, stemming is the process of reducing inflection in words to their root forms, such as mapping a group of words to the same stem even if the stem itself is not a valid word in the language.

Note: The output of a stemmer is a word that may not exist in the dictionary.

The interviewer interviews the president in an interview

```

-# Exercice 6: Text preprocessing
+# Exercise 6: Text preprocessing

-The goal of this exercice is to learn to create a function to prepocess and clean a text using NLTK.
+The goal of this exercise is to learn to create a function to preprocess and clean a text using NLTK.

Put this text in a variable:

@@ -267,22 +267,22 @@ https://towardsdatascience.com/nlp-preprocessing-with-nltk-3c04ee00edc0

```
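As a recap of exercises 1 to 6, here is a minimal sketch of such a preprocessing function. The names are illustrative, and it assumes the NLTK resources `punkt` and `stopwords` have been downloaded once:

```python
import string

from nltk.corpus import stopwords
from nltk.stem import PorterStemmer
from nltk.tokenize import word_tokenize

# One-off downloads: nltk.download('punkt') and nltk.download('stopwords')

def preprocess(text):
    # Lowercase, strip punctuation, tokenize, remove stop words, stem.
    text = text.lower()
    text = text.translate(str.maketrans('', '', string.punctuation))
    tokens = word_tokenize(text)
    stop_words = set(stopwords.words('english'))
    stemmer = PorterStemmer()
    return [stemmer.stem(t) for t in tokens if t not in stop_words]

print(preprocess("The interviewer interviews the president in an interview"))
# ['interview', 'interview', 'presid', 'interview']
```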
-# Exercice 7: Bag of Word representation
+# Exercise 7: Bag of Words representation

https://machinelearningmastery.com/gentle-introduction-bag-words-model/

-The goal of this exercice is to understand how to create a Bag of Word (BoW) model on a corpus of texts. More precesily we will create a labeled data set from textual data using a word count matrix.
+The goal of this exercise is to understand how to create a Bag of Words (BoW) model on a corpus of texts. More precisely, we will create a labeled data set from textual data using a word count matrix.

As explained in the resource, the Bag of Words representation makes the assumption that the order in which the words appear in a text doesn't matter. There are different types of Bag of Words representations:

- Boolean: Each document is a boolean vector
- Wordcount: Each document is a word count vector
-- TFIDF: Each document is a score vector. The score is detailed in the next exercice.
+- TFIDF: Each document is a score vector. The score is detailed in the next exercise.

The data `tweets_train.txt` contains tweets labeled with a sentiment. It gives the positivity of a tweet.

Steps:

-1. Preprocess the data using the function implemented in the previous exercice. And, using from `CountVectorizer` of scikitlearn with `max_features=500` compute the wordcount of the tweets. The output is a sparse matrix.
+1. Preprocess the data using the function implemented in the previous exercise. Then, using `CountVectorizer` from scikit-learn with `max_features=500`, compute the word count of the tweets. The output is a sparse matrix.

- Check the shape of the word count matrix
- Set **max_features** to 500 instead of the initial size of the dictionary.

diff --git a/one_md_per_day_format/piscine/Week3/w3day05.md b/one_md_per_day_format/piscine/Week3/w3day05.md
index 45aeee4..d7d1a8a 100644
--- a/one_md_per_day_format/piscine/Week3/w3day05.md
+++ b/one_md_per_day_format/piscine/Week3/w3day05.md
@@ -19,9 +19,9 @@ There are many type of language models pre-trained in Spacy. Each has its specif

## Ressources

-# Exercice 1 Embedding 1
+# Exercise 1 Embedding 1

-The goal of this exercice is to learn to load an embedding on SpaCy.
+The goal of this exercise is to learn to load an embedding with SpaCy.

1. Install and load `en_core_web_sm` embedding. Compute the embedding of `car`.

array([ 1.0522802e+00, 1.4806499e+00, 7.7402556e-01, 1.0373484e+00,

```

-# Exercice 2: Tokenization
+# Exercise 2: Tokenization

-The goal of this exercice is to learn to tokenize a document using Spacy. We did this using NLTK yesterday.
+The goal of this exercise is to learn to tokenize a document using Spacy. We did this using NLTK yesterday.

1. Tokenize the text below and print the tokens

@@ -68,9 +68,9 @@

.
```

-## Exercice 3 Embeddings 2
+## Exercise 3 Embeddings 2

-The goal of this exercice is to learn to use SpaCy embedding on a document.
+The goal of this exercise is to learn to use a SpaCy embedding on a document.

1. Compute the embedding of all the words in this sentence. The language model considered is `en_core_web_md`.

@@ -106,9 +106,9 @@ https://medium.com/datadriveninvestor/cosine-similarity-cosine-distance-6571387f

[logo]: w3day05ex1_plot.png "Plot"
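As an illustration of exercise 3, here is a minimal sketch (assuming `en_core_web_md` has been installed with `python -m spacy download en_core_web_md`):

```python
import numpy as np
import spacy

nlp = spacy.load('en_core_web_md')
doc = nlp("This is a sample sentence")

# One vector per token (300 dimensions for en_core_web_md).
for token in doc:
    print(token.text, token.vector.shape)

# The document vector is the average of the token vectors.
print(np.allclose(doc.vector, np.mean([t.vector for t in doc], axis=0)))
```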
-# Exercice 4 Sentences' similarity
+# Exercise 4 Sentences' similarity

-The goal of this exerice is to learn to compute the similarity between two sentences. As explained in the documentation: **The word embedding of a full sentence is simply the average over all different words**. This is how `similarity` works in SpaCy. This small use case is very interesting because if we build a corpus of sentences that express an intention as **buy shoes**, then we can detect this intention and use it to propose shoes advertisement for customers. The language model used in this exercice is `en_core_web_sm`.
+The goal of this exercise is to learn to compute the similarity between two sentences. As explained in the documentation: **The word embedding of a full sentence is simply the average over all different words**. This is how `similarity` works in SpaCy. This small use case is very interesting because if we build a corpus of sentences that express an intention such as **buy shoes**, then we can detect this intention and use it to propose shoe advertisements to customers. The language model used in this exercise is `en_core_web_sm`.

1. Compute the similarities (3 in total) between these sentences:

@@ -135,9 +135,9 @@

-# Exercice 5: NER
+# Exercise 5: NER

-The goal of this exercice is to learn to use a Named entity recognition algorithm to detect entities.
+The goal of this exercise is to learn to use a Named Entity Recognition algorithm to detect entities.

```
Apple Inc. is an American multinational technology company headquartered in Cupertino, California, that designs, develops, and sells consumer electronics, computer software, and online services. It is considered one of the Big Five companies in the U.S. information technology industry, along with Amazon, Google, Microsoft, and Facebook.
```

@@ -189,9 +189,9 @@ https://en.wikipedia.org/wiki/Named-entity_recognition

```

-# Exercice 6 Part-of-speech tags
+# Exercise 6 Part-of-speech tags

-The goal od this exercice is to learn to use the Part-of-speech tags (**POS TAG**) using Spacy. As explained in wikipedia, the POS TAG is the process of marking up a word in a text (corpus) as corresponding to a particular part of speech, based on both its definition and its context.
+The goal of this exercise is to learn to use Part-of-speech tags (**POS tags**) using Spacy. As explained in Wikipedia, POS tagging is the process of marking up a word in a text (corpus) as corresponding to a particular part of speech, based on both its definition and its context.

Example

diff --git a/one_md_per_day_format/piscine/Week3/w3day1.md b/one_md_per_day_format/piscine/Week3/w3day1.md
index 03fb6a9..b2ad06e 100644
--- a/one_md_per_day_format/piscine/Week3/w3day1.md
+++ b/one_md_per_day_format/piscine/Week3/w3day1.md
@@ -23,9 +23,9 @@ https://srnghn.medium.com/deep-learning-overview-of-neurons-and-activation-funct

Reproduce this article without backpropagation: https://towardsdatascience.com/machine-learning-for-beginners-an-introduction-to-neural-networks-d49f22d238f9

-# Exercice 1 The neuron
+# Exercise 1 The neuron

-The goal of this exercice is to understand the role of a neuron and to implement a neuron.
+The goal of this exercise is to understand the role of a neuron and to implement a neuron.

An artificial neuron, the basic unit of the neural network (also referred to as a perceptron), is a mathematical function. It takes one or more inputs that are multiplied by values called "weights" and added together. This value is then passed to a non-linear function, known as an activation function, to become the neuron's output.
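A minimal sketch of such a neuron with a sigmoid activation (the weights, bias and inputs are illustrative; the exercise may require a specific class signature):

```python
import numpy as np

def sigmoid(x):
    # Activation function: squashes the weighted sum into (0, 1).
    return 1 / (1 + np.exp(-x))

class Neuron:
    def __init__(self, weights, bias):
        self.weights = weights
        self.bias = bias

    def feedforward(self, inputs):
        # Weighted sum of the inputs plus the bias, then the activation.
        total = np.dot(self.weights, inputs) + self.bias
        return sigmoid(total)

neuron = Neuron(np.array([0.5, 0.5]), 1.0)
print(neuron.feedforward(np.array([2.0, 3.0])))  # a value in (0, 1)
```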
@@ -91,7 +91,7 @@ https://victorzhou.com/blog/intro-to-neural-networks/

# Exercise 2 Neural network

-The goal of this exercice is to understand how to combine three neurons to form a neural network. A neural newtwork is nothing else than neurons connected together. As shown in the figure the neural network is composed of **layers**:
+The goal of this exercise is to understand how to combine three neurons to form a neural network. A neural network is nothing more than neurons connected together. As shown in the figure, the neural network is composed of **layers**:

- Input layer: it only represents input data. **It doesn't contain neurons**.
- Output layer: it represents the last layer. It contains a neuron (in some cases more than 1).
- Hidden layer(s): any layer between the input (first) layer and the output (last) layer.

@@ -99,7 +99,7 @@ The goal of this exercice is to understand how to combine three neurons to form

Notice that the neuron **o1** in the output layer takes as input the output of the neurons **h1** and **h2** in the hidden layer.

-In exercice 1, you implemented this neuron.
+In exercise 1, you implemented this neuron.

![alt text][neuron]
[neuron]: images/day1/ex2/w3_day1_neuron.png "Plot"

@@ -143,9 +143,9 @@ Now, we add two more neurons:

1. This question is validated if the output is: **0.9524917424084265**

-# Exercice 3 Log loss
+# Exercise 3 Log loss

-The goal of this exercice is to implement the Log loss function. As mentioned last week, this function is used in classification as a **loss function**. It means that the better the classifier is, the smaller the loss function is. W2D1, you implemented the gradient descent on the MSE loss to update the weights of the linear regression. Similarly, the minimization of the Log loss leads to finding optimal weights.
+The goal of this exercise is to implement the Log loss function. As mentioned last week, this function is used in classification as a **loss function**: the better the classifier is, the smaller the loss is. In W2D1, you implemented gradient descent on the MSE loss to update the weights of the linear regression. Similarly, minimizing the Log loss leads to finding the optimal weights.

Log loss: -1/n * Sum[y_true*log(y_pred) + (1-y_true)*log(1-y_pred)]

@@ -163,7 +163,7 @@ https://scikit-learn.org/stable/modules/generated/sklearn.metrics.log_loss.html

1. This question is validated if the output is: **0.5472899351247816**.
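To see the formula in action, here is a short sketch on illustrative labels and predictions (not the exercise's data); the manual computation matches scikit-learn's `log_loss`:

```python
import numpy as np
from sklearn.metrics import log_loss

y_true = np.array([0, 1, 1, 0])          # illustrative labels
y_pred = np.array([0.1, 0.8, 0.6, 0.3])  # illustrative predicted probabilities

manual = -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))
print(manual)
print(log_loss(y_true, y_pred))  # same value as the manual computation
```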
-# Exercice 4 Forward propagation
+# Exercise 4 Forward propagation

The goal of this exercise is to compute the log loss on the output of the forward propagation. The data used is the tiny data set below.

@@ -198,9 +198,9 @@ The goal if the network is to predict the success at the exam given math and chemistry grades

2. This question is validated if the logloss for the 4 students is **0.5485133607757963**.

-# Exercice 5 Regression
+# Exercise 5 Regression

-The goal of this exercice is to learn to adapt the output layer to regression.
+The goal of this exercise is to learn to adapt the output layer to regression.

As a reminder, one of the reasons the sigmoid is used in classification is that it squashes the output between 0 and 1, which is the expected output range for a probability (W2D2: Logistic regression). However, the output of a regression is not a probability.

In order to perform a regression using a neural network, the activation function of the neuron on the output layer has to be changed to the **identity function**. In mathematics, the identity function is **f(x) = x**; in other words, it returns the input as is. The three steps become:

@@ -218,7 +218,7 @@ In order to perform a regression using a neural network, the activation function

All other neurons' activation functions **don't change**.

-1. Adapt the neuron class implemented in exercice 1. It now takes as a parameter `regression` which is boolean. When its value is `True`, `feedforward` should use the identity function as activation function instead of the sigmoid function.
+1. Adapt the neuron class implemented in exercise 1. It now takes a boolean parameter `regression`. When its value is `True`, `feedforward` should use the identity function as the activation function instead of the sigmoid.

```