Compare commits

...

33 Commits

Author SHA1 Message Date
sagarishere 5ccf29d472
Merge 4fe9e03a0e into 2f7977a95a 2024-09-18 16:43:11 +03:00
Oumaima Fisaoui 2f7977a95a Chore(AI): Fix sp500 subject and audit 2024-09-09 09:58:11 +01:00
Oumaima Fisaoui 1d34ea0a71 Chore(DPxAI): Fix format 2024-09-06 11:18:31 +01:00
Oumaima Fisaoui 75472c0ed6 Chore(DPxAI): Fix format 2024-09-06 11:18:31 +01:00
Oumaima Fisaoui cccab05477 Chore(DPxAI): Fix format 2024-09-06 11:18:31 +01:00
Oumaima Fisaoui aa54ab1e66 Chore(DPxAI): Fix format 2024-09-06 11:18:31 +01:00
Oumaima Fisaoui 62486ed720 Chore(DPxAI): Fix the accuracy on test set 2024-09-06 11:18:31 +01:00
Oumaima Fisaoui 659074232f Chore(AI): Fix emotions detector 2024-09-06 11:18:31 +01:00
oumaimafisaoui 9c9adb1c88 Fix(Pipeline): Fix irradiat attribute values 2024-09-05 14:49:00 +01:00
oumaimafisaoui 00813d29e9 Fix(Pipeline): fix formatting 2024-09-05 14:49:00 +01:00
oumaimafisaoui fe5f82edcf Fix(Pipeline): fix datafile data info and example do not match 2024-09-05 14:49:00 +01:00
Harry f26da6368e
feat(template): question / potential-issue 2024-09-04 19:53:01 +01:00
Harry 4a8287754d
chore(template): change bug emoji 2024-09-04 19:36:24 +01:00
Abdelhak ELYakoubi b0d041741d
CON-2931 Resolve missing assets for `stealth-boom` project (#2705)
* CON-2931 <Refined Stealth-boom subject and audit, and provided a file that contains assets>

* CON-2931 <Prettier formatting applied>

* CON-2931 chore(stealth-boom) removed uppercase characters from link

* CON-2931 docs(stealth-boom) added scenario and added image

* CON-2931 fix(stealth-boom) added review suggestions and rephrased some poorly written sentences

* CON-2931 fix(stealth-boom) added more review changes and refactored some sentences for clarity

* CON-2931 fix(stealth-boom) removed extra '#' from all headers in audit
2024-09-04 16:32:22 +01:00
Oumaima Fisaoui 6003e18e50 Chore(DPxAI): format 2024-09-04 10:56:11 +01:00
Oumaima Fisaoui 718dabf423 Chore(DPxAI): change prompt engineering 2024-09-04 10:56:11 +01:00
Oumaimafisaoui 2b8dd0028b Chore(DPxAI): Make all AI powered learning sections unified 2024-09-04 10:56:11 +01:00
Oumaima Fisaoui c93a142538 Chore(DPxAI): Fix format 2024-09-04 09:23:50 +01:00
Oumaima Fisaoui bb5d83fa41 Chore(DPxAI): Fix format 2024-09-04 09:23:50 +01:00
Oumaima Fisaoui 2a52ae33f4 Chore(DPxAI): fix format 2024-09-04 09:23:50 +01:00
Oumaima Fisaoui 2b2c86b17b Chore(DPxAI): Fix format 2024-09-04 09:23:50 +01:00
Oumaimafisaoui c5a9980f52 Chore(DPxAI): Add play with variables 2024-09-04 09:23:50 +01:00
oumaimafisaoui 9afa861f7a Fix(DPxAI): Fix formatting 2024-09-04 09:23:50 +01:00
oumaimafisaoui 37495c5749 Chore(DPxAI): Applied suggested changes on text formats 2024-09-04 09:23:50 +01:00
oumaimafisaoui 17c3fc112b Chore(Quest01/Ex02): formatting 2024-09-04 09:23:50 +01:00
oumaimafisaoui 63f9f4aed5 Chore(Quest01/Ex02): unify text 2024-09-04 09:23:50 +01:00
oumaimafisaoui 6c1fa67d5a Chore(Quest01/Ex02): add challenge mode and fix power variable 2024-09-04 09:23:50 +01:00
oumaimafisaoui fbb83dc02a Chore(DPxAI): add solution for play with variables 2024-09-04 09:23:50 +01:00
dependabot[bot] 3c6e198cda Chore(deps): Bump flask-cors
Bumps [flask-cors](https://github.com/corydolphin/flask-cors) from 4.0.1 to 5.0.0.
- [Release notes](https://github.com/corydolphin/flask-cors/releases)
- [Changelog](https://github.com/corydolphin/flask-cors/blob/main/CHANGELOG.md)
- [Commits](https://github.com/corydolphin/flask-cors/compare/4.0.1...5.0.0)

---
updated-dependencies:
- dependency-name: flask-cors
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-09-03 17:53:05 +01:00
Oumaimafisaoui 24f188ae79 Chore(DPxAI): Fix typo 2024-09-03 13:34:37 +01:00
Oumaimafisaoui a53b9b2654 Test(glance-on-power): changed Hello There ! to Hello There 2024-09-03 13:34:37 +01:00
oumaimafisaoui 411220450c Docs(DPxAI): Fix project instructions 2024-09-02 15:07:19 +01:00
sagarishere 4fe9e03a0e
Condition in tests but not in task description
I have added the condition that was not included in the task description but was included in the tests.
Also, `__proto__` is a non-standard property.
It may be good to mention this, as the first test usually fails simply because ignoring properties from the prototype chain is not mentioned.
2023-01-24 15:42:28 +02:00
23 changed files with 343 additions and 181 deletions

View File

@ -1,8 +1,8 @@
---
name: 🐛 Bug report
name: 🐞 Bug report
about: Create a report to help us improve
title: "[BUG] "
labels: "🐛 bug"
labels: "🐞 bug"
assignees: ""
---

View File

@ -0,0 +1,26 @@
---
name: 🙋 Question / Potential Issue
about: Ask a question or report a potential issue that isn't clearly a bug or a feature request
title: "[QUESTION] "
labels: "🙋 question"
assignees: ""
---
**Describe your question or potential issue**
A clear and concise description of your question or the potential issue you have encountered.
**Context & Use Case**
Provide the context or the scenario in which this question or issue arises. Explain why this is important to understand or address.
**Steps taken**
List any steps you have taken to try and resolve the issue or answer the question:
1. Checked the documentation/readme...
2. Tried to reproduce the issue...
3. Searched for similar questions...
**Attachments**
If applicable, add any screenshots, logs, or additional information that could help explain your question or potential issue.
**Additional context**
Add any other details or context that might be relevant, including links to related issues or documentation.

View File

@ -12,7 +12,7 @@
"code": "const args = saveArguments(console, 'log')\n\n// Your code\n\nconst typeOfLoggedValues = args.flat().map((v) => typeof v)\nif (!typeOfLoggedValues.includes('string')) {\n throw Error('you must log a string')\n}"
},
{
"description": "Log the string Hello There ! in the console",
"code": "const args = saveArguments(console, 'log')\n\n// Your code\n\nconst loggedValues = args.flat().join(' ')\nif (!loggedValues.includes('Hello There !')) {\n throw Error('you must log the text Hello There !')\n}"
"description": "Log the string Hello There! in the console",
"code": "const args = saveArguments(console, 'log')\n\n// Your code\n\nconst loggedValues = args.flat().join(' ')\nif (!loggedValues.includes('Hello There!')) {\n throw Error('you must log the text Hello There!')\n}"
}
]

View File

@ -10,7 +10,7 @@ You can create your own functions to give your robot unique abilities and make i
### AI-Powered Learning Techniques
`Reflective Practice Technique:`
**Reflective Practice Technique:**
This type of prompt encourages you to reflect on the concepts you've just learned, reinforcing your understanding by applying the concepts in different contexts or scenarios.
Find the examples across the subject ;)

View File

@ -76,6 +76,6 @@ Don't forget to test your code before submitting it, using the `Run` button.
Videos designed to give **hints** are assigned to each quest. It is strongly suggested to watch them as you go.
### Ressources
### Resources
- [Introduction to JavaScript](https://developer.mozilla.org/en-US/docs/Learn/JavaScript/First_steps)

View File

@ -12,7 +12,7 @@ Let's find out more!
### AI-Powered Learning Techniques
`Guided Practice Technique:`
**Guided Practice Technique:**
This type of prompt encourages the AI to guide learners through hands-on practice exercises, providing immediate feedback and helping them build confidence in their skills.
Find the examples across the subject ;)

View File

@ -12,7 +12,7 @@ Let's find out more!
### AI-Powered Learning Techniques
`Code Chunking Technique:`
**Code Chunking Technique:**
This type of prompt encourages you to break down larger pieces of code into smaller, digestible chunks.

View File

@ -12,7 +12,7 @@ Let's discover them together!
### AI-Powered Learning Techniques
`Example-Based Learning Technique:`
**Example-Based Learning Technique:**
This type of prompt encourages the AI to provide concrete examples to illustrate concepts, making it easier to understand and apply them.
Find the examples across the subject ;)

View File

@ -8,7 +8,7 @@ Think about all the objects around you: like your book or robot friend. Each of
### AI-Powered Learning Techniques
`Visualization Technique:`
**Visualization Technique:**
This type of prompt encourages the AI to explain a concept using diagrams or visual representations to illustrate concepts.
Find the examples across the subject ;)

View File

@ -10,7 +10,7 @@ Let's have some fun with it!
### AI-Powered Learning Techniques
` Interactive Learning Technique:`
**Interactive Learning Technique:**
This type of prompt engages you in active problem-solving by providing challenges or tasks that require applying concepts. This can work to compare your results with your peers!
Find the examples across the subject ;)

View File

@ -0,0 +1,93 @@
## Play with variables
> Mindful AI mode
### Context
Remember that if things get a little hectic at times, take the time to get closer to your peers so that you can think, share and move forward together.
> Keep Going!
### AI-Powered Learning Techniques
**Clarification Technique:**
This type of prompt encourages the AI to explain a concept in detail, helping you gain a deeper understanding.
> Find the examples across the subject ;)
## Concepts
### Escape characters
**Quote delimiters** can be one of the tricky things to deal with.
Since they are used for delimiting text, they need a trick to include them in
our text.
For example, we want a `'` _(single quote)_ in our text, but we are already using single quotes as
delimiters:
```js
console.log('I keep trying, I can't give up!')
// too bad, a single quote ruined the quote, get it?
```
The `\` _(backslash)_ is used for that:
Every time there is an _extra special_ character in your string, putting a `\`
in front of it will **escape** it, letting JS understand that you meant the
**literal** following character and not the delimiter, _or whatever else
the character normally means in a string_.
```js
console.log('I keep trying, I can\'t give up!');
// Output: I keep trying, I can't give up!
```
#### **`Prompt Example`**:
"As a beginner, how do I include special characters in a string in JavaScript? Give me simple examples too."
### Assign re-assign
Remember the `let` keyword is used to declare new variables.
> Note that we can't have multiple variables with the same identifier, otherwise JS wouldn't know which one is which.
If you redeclare a variable, it will crash!
But it is still possible to use the `=` (assignment operator) to change its value!
> Note that sometimes you may find variables declared with `const`. This means that the assignment is constant and the variable can never be re-assigned!
> It is used to protect your code against errors, but you can always use `let` in its place.
> Also, you may find old code online using `var`. We have been trying to get rid of `var` since 2015. It's ancient syntax and it was pretty problematic. Never use it! If you see code using it, try to find a more recent example. That one is outdated.
#### **`Prompt Example`**:
- "As a beginner, what is the difference between `let` and `const` in JavaScript?"
- "As a beginner, how do I reassign a value to an already declared variable in JavaScript?"
### Instructions
#### Task 1:
- Create an `escapeFromDelimiters` variable that includes all 3 quotes _(`` ` ``, `"` and
`'`)_.
- Create an `escapeTheEscape` variable that includes a backslash _(`\`)_.
#### Task 2:
- The variable `power` has been declared and will be used during the tests.
- You must re-assign the `power` variable to the string value `levelMax`, but without re-declaring it!
---
> “How did I escape? With difficulty. How did I plan this moment? With
> pleasure.” \
> ― Alexandre Dumas, The Count of Monte Cristo

View File

@ -10,8 +10,8 @@ In JavaScript, we use objects to group these properties together, making it easy
### AI-Powered Learning Techniques
`Reflective Technique:`
This type of prompt encourages the AI to help learners reflect on their understanding by asking questions and prompting them to think critically about the concepts.
**Contextual Learning Technique:**
This strategy places learning within a real-world or practical context by having the AI relate concepts to scenarios you might encounter in everyday life or your work. It helps you understand the relevance of what you're learning and how to apply it.
Find the examples across the subject ;)
@ -54,9 +54,9 @@ robot.code = undefined;
#### **`Prompt Example`**:
- "How does modifying an object's property differ from adding a new property or removing an existing one?"
- "Think about your pet. How would you use an object to keep track of its name, age, and favorite food?"
- "Can you think of a scenario where using an object to group related properties would be more beneficial than using separate variables?"
- "You have a collection of books. How could you use an object to remember the title, author, and how many pages each book has?"
### Instructions

View File

@ -1,18 +1,22 @@
## Emotions detection with Deep Learning
Cameras are everywhere. Videos and images have become one of the most interesting data sets for artificial intelligence.
Image processing is a quite broad research area, not just filtering, compression, and enhancement. Besides, we are even interested in the question, “what is in images?”, i.e., content analysis of visual inputs, which is part of the main task of computer vision.
The study of computer vision could make possible such tasks as 3D reconstruction of scenes, motion capturing, and object recognition, which are crucial for even higher-level intelligence such as image and video understanding, and motion understanding.
For this 2 months project we will
focus on two tasks:
Image processing is quite a broad research area, not just filtering, compression, and enhancement.
- emotion classification
- face tracking
Besides, we are even interested in the question, “what is in images?”, i.e., content analysis of visual inputs, which is part of the main task of computer vision.
The study of computer vision could make possible such tasks as 3D reconstruction of scenes, motion capturing, and object recognition, which are crucial for even higher-level intelligence such as image and video understanding, and motion understanding.
For this project we will focus on two tasks:
- Emotion classification
- Face tracking
With computing power increasing exponentially, the computer vision field has been developing rapidly. This is a key element because this computing power makes it easier to use a type of neural network that is very powerful on images:
CNN's (Convolutional Neural Networks). Before the CNNs were democratized, the algorithms used relied a lot on human analysis to extract features which obviously time-consuming and not reliable. If you're interested in the "old
school methodology" [this article](https://towardsdatascience.com/classifying-facial-emotions-via-machine-learning-5aac111932d3) explains it.
The history behind this field is fascinating! [Here](https://kapernikov.com/basic-introduction-to-computer-vision/) is a short summary of its history.
- CNN's (Convolutional Neural Networks). Before CNNs were democratized, the algorithms used relied a lot on human analysis to extract features, which was obviously time-consuming and not reliable. If you're interested in the "old school methodology", [this article](https://towardsdatascience.com/classifying-facial-emotions-via-machine-learning-5aac111932d3) explains it.
- The history behind this field is fascinating! [Here](https://kapernikov.com/basic-introduction-to-computer-vision/) is a short summary of its history.
### Project goal and suggested timeline
@ -31,15 +35,18 @@ The two steps are detailed below.
### Preliminary:
- Take [this course](https://www.coursera.org/learn/convolutional-neural-networks). This course is a reference for many reasons and one of them is the creator: **Andrew Ng**. He explains the basics of CNNs but also some more advanced topics such as transfer learning, siamese networks, etc.
I suggest to focus on Week 1 and 2 and to spend less time on Week 3 and 4. Don't worry the time scoping of such MOOCs are conservative. You can attend the lessons for free!
- I suggest focusing on Weeks 1 and 2 and spending less time on Weeks 3 and 4. Don't worry, the time scoping of such MOOCs is conservative. You can attend the lessons for free!
- Participate in [this challenge](https://www.kaggle.com/c/digit-recognizer/code). The MNIST dataset is a reference in computer vision. Researchers use it as a benchmark to compare their models.
Start first with a logistic regression to understand how to handle images in Python. And then train your first CNN on this data set.
- Start with a logistic regression to understand how to handle images in Python, and then train your first CNN on this dataset.
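Not part of the subject, just a rough illustration of that preliminary step: a minimal Keras sketch that loads MNIST from `keras.datasets` and trains a small CNN (the architecture and hyperparameters below are arbitrary placeholders):

```python
from tensorflow import keras
from tensorflow.keras import layers

# Load MNIST (28x28 grayscale digits) and scale pixels to [0, 1].
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0
x_test = x_test[..., None].astype("float32") / 255.0

# A deliberately small CNN: two conv/pool blocks followed by a dense head.
model = keras.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, batch_size=128, validation_split=0.1)
print("Test accuracy:", model.evaluate(x_test, y_test, verbose=0)[1])
```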
### Face emotions classification
Emotion detection is one of the most researched topics in the modern-day machine learning arena. The ability to accurately detect and identify an emotion opens up numerous doors for Advanced Human Computer Interaction.
The aim of this project is to detect up to seven distinct facial emotions in real time. This project runs on top of a Convolutional Neural Network (CNN) that is built with the help of Keras whose backend is TensorFlow in Python.
The aim of this project is to detect up to seven distinct facial emotions in real time.
This project runs on top of a Convolutional Neural Network (CNN) built with Keras, using TensorFlow as its backend, in Python.
The facial emotions that can be detected and classified by this system are Happy, Sad, Angry, Surprise, Fear, Disgust and Neutral.
Your goal is to implement a program that takes as input a video stream that contains a person's face and that predicts the emotion of the person.
@ -49,10 +56,10 @@ Your goal is to implement a program that takes as input a video stream that cont
- Download and unzip the [data here](https://assets.01-edu.org/ai-branch/project3/emotions-detector.zip).
This dataset was provided for this past [Kaggle challenge](https://www.kaggle.com/competitions/challenges-in-representation-learning-facial-expression-recognition-challenge/overview).
It is possible to find more information about it on the challenge page. Train a CNN on the dataset `train.csv`. Here is an [example of architecture](https://www.quora.com/What-is-the-VGG-neural-network) you can implement.
**The CNN has to perform more than 70% on the test set**. You can use the `test_with_emotions.csv` file for this. You will see that the CNNs take a lot of time to train.
**The CNN has to perform more than 60% on the test set**. You can use the `test_with_emotions.csv` file for this. You will see that the CNNs take a lot of time to train.
You don't want to overfit the neural network. I strongly suggest using early stopping and callbacks, and monitoring the training using `TensorBoard`.
You have to save the trained model in `my_own_model.pkl` and to explain the chosen architecture in `my_own_model_architecture.txt`. Use `model.summary())` to print the architecture.
You have to save the trained model in `final_emotion_model.keras` and explain the chosen architecture in `final_emotion_model_arch.txt`. Use `model.summary()` to print the architecture.
It is also expected that you explain the iterations and how you ended up choosing your final architecture. Save a screenshot of `TensorBoard` while the model is training in `tensorboard.png`, and save a plot with the learning curves showing the model training and stopping BEFORE it starts overfitting in `learning_curves.png`.
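A minimal, non-authoritative sketch of such a training setup, assuming FER-style 48x48 grayscale inputs with 7 classes; the random stand-in data, the tiny architecture, and the output paths are placeholders (in the project they would live under `results/model/`):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Stand-in data shaped like FER images (48x48 grayscale, 7 classes); replace with
# the real train/validation split built from train.csv.
x_train, y_train = np.random.rand(64, 48, 48, 1), np.random.randint(0, 7, 64)
x_val, y_val = np.random.rand(16, 48, 48, 1), np.random.randint(0, 7, 16)

# Placeholder architecture, much smaller than what the project needs.
model = keras.Sequential([
    layers.Input(shape=(48, 48, 1)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(7, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

callbacks = [
    # Stop before overfitting and keep the best weights seen so far.
    keras.callbacks.EarlyStopping(monitor="val_loss", patience=5, restore_best_weights=True),
    # Log metrics so training can be monitored in TensorBoard.
    keras.callbacks.TensorBoard(log_dir="logs"),
]
model.fit(x_train, y_train, validation_data=(x_val, y_val),
          epochs=10, batch_size=32, callbacks=callbacks)

# Save the model (native Keras format on recent TF/Keras) and dump the architecture.
model.save("final_emotion_model.keras")
with open("final_emotion_model_arch.txt", "w") as f:
    model.summary(print_fn=lambda line: f.write(line + "\n"))
```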
- Optional: Use a pre-trained CNN to improve the accuracy. You will find some huge CNN architectures that perform well. The issue is that it is expensive to train them from scratch.
@ -86,13 +93,10 @@ project
├── environment.yml
├── README.md
├── results
│   ├── hack_cnn
│   │   ├── hacked_image.png
│   │   └── input_image.png
│   ├── model
│   │   ├── learning_curves.png
│   │   ├── my_own_model_architecture.txt
│   │   ├── my_own_model.pkl
│   │   ├── final_emotion_model_arch.txt
│   │   ├── final_emotion_model.keras
│   │   ├── pre_trained_model_architecture.txt
│   │   └── pre_trained_model.pkl
│   └── preprocessing_test
@ -101,7 +105,7 @@ project
│   ├── image_n.png
│   └── input_video.mp4
└── scripts
├── hack_the_cnn.py
|__ validation_loss_accuracy.py
├── predict_live_stream.py
├── predict.py
├── preprocess.py
@ -114,7 +118,7 @@ project
```prompt
python ./scripts/predict.py
Accuracy on test set: 72%
Accuracy on test set: 62%
```
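For orientation only, a sketch of what such a prediction script could do, assuming `test_with_emotions.csv` follows the FER2013 layout (an `emotion` label column and a space-separated 48x48 `pixels` column); the paths are placeholders:

```python
import numpy as np
import pandas as pd
from tensorflow import keras

# Assumption: FER2013-style CSV with `emotion` labels and space-separated pixel strings.
df = pd.read_csv("test_with_emotions.csv")
X = np.stack([np.asarray(p.split(), dtype="float32") for p in df["pixels"]])
X = X.reshape(-1, 48, 48, 1) / 255.0
y = df["emotion"].to_numpy()

# Load the saved model and report accuracy on the labelled test set.
model = keras.models.load_model("results/model/final_emotion_model.keras")
predictions = model.predict(X, verbose=0).argmax(axis=1)
print(f"Accuracy on test set: {(predictions == y).mean():.0%}")
```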

View File

@ -24,12 +24,12 @@
###### Does the text document explain why the architecture was chosen, and what were the previous iterations?
###### Does the following command `python ./scripts/predict.py` run without any error and returns an accuracy greater than 70%?
###### Does the following command `python ./scripts/predict.py` run without any error and return an accuracy greater than 60%?
```prompt
python ./scripts/predict.py
Accuracy on test set: 72%
Accuracy on test set: 62%
```

View File

@ -46,10 +46,8 @@ project
| | gender_submission.csv
└───notebook
│ │ EDA.ipynb
|
|───scripts
│ │ main.ipynb
```
@ -59,21 +57,19 @@ project
- `username.txt` contains the username, the last modified date of the file **has to correspond to the first day of the project**.
- `EDA.ipynb` contains the exploratory data analysis. This file should contain all steps of data analysis that contributed or not to improve the accuracy. It has to be commented so that the reviewer can understand the analysis and run it without any problem.
- `scripts` contains python file(s) that perform(s) the feature engineering, the model's training and prediction on the test set. It could also be one single Jupyter Notebook. It has to be commented to help the reviewers understand the approach and run the code without any bugs.
- `main.ipynb`: this single Jupyter Notebook should contain all steps of data analysis that contributed (or not) to improving the accuracy, the feature engineering, the model's training, and the prediction on the test set. It has to be commented to help the reviewers understand the approach and run the code without any bugs.
- **Submit your predictions on Kaggle's competition platform**. Check your ranking and score in the leaderboard.
### Scores
In order to validate the project you will have to score at least **79% accuracy on the Leaderboard**:
- 79% accuracy is the minimum score to validate the project.
- 78.9% accuracy is the minimum score to validate the project.
Scores indication:
- 79% difficult - minimum required
- 81% very difficult: smart feature engineering needed
- 78.9% difficult - minimum required
- 80% very difficult: smart feature engineering needed
- More than 83%: excellent that corresponds to the top 2% on Kaggle
- More than 85%: cheating
@ -108,8 +104,6 @@ Iteration 3:
- Perform an EDA. Make assumptions and check them. Example: what if first-class passengers survived more? Check the assumption through EDA and create relevant features to help the model capture the information.
- Run a gridsearch
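A non-authoritative sketch of that gridsearch step, assuming the standard Kaggle Titanic `train.csv` layout and an intentionally crude feature set (the model choice and parameter grid are placeholders):

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Crude feature engineering on standard Kaggle Titanic columns (illustration only).
df = pd.read_csv("train.csv")
X = pd.get_dummies(df[["Pclass", "Sex", "SibSp", "Parch", "Fare"]].fillna(0))
y = df["Survived"]

# Exhaustive search over a small hyperparameter grid with 5-fold cross-validation.
param_grid = {"n_estimators": [100, 300], "max_depth": [4, 6, 8]}
search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid,
                      cv=5, scoring="accuracy", n_jobs=-1)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```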
Iteration 4:
- Good luck !

View File

@ -241,8 +241,10 @@ breast: One Hot
breast-quad: One Hot
['right_low' 'left_low' 'left_up' 'central' 'right_up']
irradiat: One Hot
['yes' 'no']
Class: Target (One Hot)
['recurrence-events' 'no-recurrence-events']
```
@ -259,16 +261,16 @@ input: ohe.transform(X_test[ohe_cols])[:10]
output:
array([[1., 0., 1., 0., 0., 1., 0., 0., 0., 1., 0.],
[1., 0., 1., 0., 0., 1., 0., 0., 0., 1., 0.],
[0., 1., 1., 0., 0., 1., 0., 0., 0., 0., 1.],
[0., 1., 1., 0., 0., 1., 0., 0., 0., 0., 1.],
[1., 0., 1., 0., 0., 0., 1., 0., 0., 1., 0.],
[1., 0., 1., 0., 0., 0., 0., 1., 0., 1., 0.],
[1., 0., 0., 1., 0., 0., 0., 0., 1., 1., 0.],
[1., 0., 0., 1., 0., 1., 0., 0., 0., 1., 0.],
[1., 0., 0., 1., 0., 0., 0., 1., 0., 1., 0.],
[1., 0., 0., 1., 0., 0., 1., 0., 0., 1., 0.],
[1., 0., 0., 1., 0., 0., 1., 0., 0., 1., 0.],
[1., 0., 0., 1., 0., 1., 0., 0., 0., 1., 0.],
[1., 0., 0., 1., 0., 0., 1., 0., 0., 1., 0.],
[0., 1., 1., 0., 0., 0., 1., 0., 0., 0., 1.]])
[1., 0., 1., 0., 0., 0., 0., 1., 0., 0., 1.],
[1., 0., 0., 1., 0., 1., 0., 0., 0., 1., 0.]])
input: ohe.get_feature_names(ohe_cols)
input: ohe.get_feature_names_out(ohe_cols)
output:
array(['node-caps_no', 'node-caps_yes', 'breast_left', 'breast_right',
'breast-quad_central', 'breast-quad_left_low',
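The hunk above reflects scikit-learn's rename of `get_feature_names` to `get_feature_names_out`; here is a minimal sketch of the newer call on toy data (the column names come from the example above, the rows are made up):

```python
import numpy as np
from sklearn.preprocessing import OneHotEncoder

# Toy categorical rows standing in for the dataset's columns (values are illustrative).
ohe_cols = ["node-caps", "breast", "breast-quad"]
X_train = np.array([["yes", "left", "left_low"],
                    ["no", "right", "central"]])

ohe = OneHotEncoder(handle_unknown="ignore")
ohe.fit(X_train)
print(ohe.transform(X_train).toarray())
print(ohe.get_feature_names_out(ohe_cols))  # replaces the removed get_feature_names
```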

View File

@ -146,14 +146,14 @@ dtype: int64
array([[1., 0., 1., 0., 0., 1., 0., 0., 0., 1., 0.],
[1., 0., 1., 0., 0., 1., 0., 0., 0., 1., 0.],
[0., 1., 1., 0., 0., 1., 0., 0., 0., 0., 1.],
[0., 1., 1., 0., 0., 1., 0., 0., 0., 0., 1.],
[1., 0., 1., 0., 0., 0., 1., 0., 0., 1., 0.],
[1., 0., 1., 0., 0., 0., 0., 1., 0., 1., 0.],
[1., 0., 0., 1., 0., 0., 0., 0., 1., 1., 0.],
[1., 0., 0., 1., 0., 1., 0., 0., 0., 1., 0.],
[1., 0., 0., 1., 0., 0., 0., 1., 0., 1., 0.],
[1., 0., 0., 1., 0., 0., 1., 0., 0., 1., 0.],
[1., 0., 0., 1., 0., 0., 1., 0., 0., 1., 0.],
[1., 0., 0., 1., 0., 1., 0., 0., 0., 1., 0.],
[1., 0., 0., 1., 0., 0., 1., 0., 0., 1., 0.],
[0., 1., 1., 0., 0., 0., 1., 0., 0., 0., 1.]])
[1., 0., 1., 0., 0., 0., 0., 1., 0., 0., 1.],
[1., 0., 0., 1., 0., 1., 0., 0., 0., 1., 0.]])
```
@ -162,16 +162,16 @@ array([[1., 0., 1., 0., 0., 1., 0., 0., 0., 1., 0.],
```console
#First 10 rows:
array([[1., 2., 5., 0., 1.],
[1., 3., 4., 0., 1.],
[1., 2., 4., 0., 1.],
[1., 3., 2., 0., 1.],
[1., 4., 3., 0., 1.],
[1., 4., 5., 0., 0.],
[2., 5., 4., 0., 1.],
[2., 5., 8., 0., 1.],
[0., 2., 3., 0., 2.],
[1., 3., 6., 4., 2.]])
array([[2., 5., 2., 0., 1.],
[2., 5., 2., 0., 0.],
[2., 5., 4., 5., 2.],
[1., 4., 5., 1., 1.],
[2., 5., 5., 0., 2.],
[1., 2., 1., 0., 1.],
[1., 2., 8., 0., 1.],
[2., 5., 2., 0., 0.],
[2., 5., 5., 0., 2.],
[1., 2., 3., 0., 0.]])
```
@ -180,8 +180,8 @@ array([[1., 2., 5., 0., 1.],
```console
# First 2 rows:
array([[1., 0., 1., 0., 0., 1., 0., 0., 0., 1., 0., 1., 2., 5., 0., 1.],
[1., 0., 1., 0., 0., 1., 0., 0., 0., 1., 0., 1., 3., 4., 0., 1.]])
array([[1., 0., 1., 0., 0., 1., 0., 0., 0., 1., 0., 2., 5., 2., 0., 1.],
[1., 0., 1., 0., 0., 1., 0., 0., 0., 1., 0., 2., 5., 2., 0., 0.]])
```
---

View File

@ -1,10 +1,12 @@
## Financial strategies on the SP500
In this project we will apply machine to finance. You are a Quant/Data Scientist and your goal is to create a financial strategy based on a signal outputted by a machine learning model that over-performs the [SP500](https://en.wikipedia.org/wiki/S%26P_500).
In this project, you'll apply machine learning to finance. Your goal as a Quant/Data Scientist is to create a financial strategy that uses a signal generated by a machine learning model to outperform the [SP500](https://en.wikipedia.org/wiki/S%26P_500).
The Standard & Poor's 500 Index is a collection of stocks intended to reflect the overall return characteristics of the stock market as a whole. The stocks that make up the S&P 500 are selected by market capitalization, liquidity, and industry. Companies to be included in the S&P are selected by the S&P 500 Index Committee, which consists of a group of analysts employed by Standard & Poor's.
The S&P 500 Index originally began in 1926 as the "composite index" comprised of only 90 stocks. According to historical records, the average annual return since its inception in 1926 through 2018 is approximately 10%-11%. The average annual return since adopting 500 stocks into the index in 1957 through 2018 is roughly 8%.
As a Quant Researcher, you may beat the SP500 one year or few years. The real challenge though is to beat the SP500 consistently over decades. That's what most hedge funds in the world are trying to do.
The S&P 500 Index is a collection of 500 stocks that represent the overall performance of the U.S. stock market. The stocks in the S&P 500 are chosen based on factors like market value, liquidity, and industry. These selections are made by the S&P 500 Index Committee, which is a group of analysts from Standard & Poor's.
The S&P 500 started in 1926 with only 90 stocks and has grown to include 500 stocks since 1957. Historically, the average annual return of the S&P 500 has been about 10-11% since 1926, and around 8% since 1957.
As a Quantitative Researcher, your challenge is to develop a strategy that can consistently outperform the S&P 500, not just in one year, but over many years. This is a difficult task and is the primary goal of many hedge funds around the world.
The project is divided into parts:
@ -199,4 +201,5 @@ Note: `features_engineering.py` can be used in `gridsearch.py`
### Files for this project
You can find the data required for this project in this [link](https://assets.01-edu.org/ai-branch/project4/project04-20221031T173034Z-001.zip)
You can find the data required for this project in this:
[link](https://assets.01-edu.org/ai-branch/project4/project04-20221031T173034Z-001.zip)

View File

@ -1,82 +1,101 @@
## stealth-boom
In this project, you will have to create an entire stealth game using Unreal Engine.
A stealth game.
<center>
<img src="./resources/mgsmeme.png?raw=true" style = "width: 700px !important; height: 464px !important;"/>
</center>
### Objectives
The idea of this project is to create a little 10 minutes gameplay game with missions, with stealth based gameplay and with an AI patrolling NPC.
The goal of this project is to create a playable gameplay loop around stealth mechanics.
The basics assets you will need for this project can be found in the [Stealth-Boom.zip](https://assets.01-edu.org/Unreal-Engine-Projects/StealthBoom.zip) file. It contains the basic animations, character, enemies and props you will need for this project.
### Scenario
You are an Unreal Engine developer tasked with creating a complete stealth game from scratch. You need to implement everything, including AI behavior trees, player animations, and gun mechanics. As you progress, you realize balancing these elements—ensuring the AI responds accurately to player movements while maintaining smooth gameplay—is more challenging than expected. Every feature, from the main menu to mission completion, must work seamlessly to deliver a polished and playable game within the given constraints.
Good luck, and remember to have fun while making the game!
### Instructions
The following aspects should be fulfilled:
The following requirements should be fulfilled:
- Create a map where the player can walk around.
#### Main Menu:
- This map should contain places for the player to hide from enemies by crouching, hide behind walls, and all other props you may use to help it make a stealth game.
- Buildings with at least 2 floors.
- Pickable ammunition and weapons around the map.
- Option to start the game.
- Adjust the general sound of the game.
- Change graphics settings:
- When changing the resolution, a pop-up should appear in the middle of the screen, asking if the player wants to keep the newly applied graphics settings. If the player clicks "Yes" within 10 seconds, the settings are confirmed. If the 10 seconds pass or the player clicks "No," the settings revert to the previous ones.
- Change the mouse sensitivity.
- Option to invert the mouse vertical axis.
- For the player you should add:
#### Map/Level.
- Walk and run animations
- Reload animation
- Aim animation
- Shoot animation
- Crouch animation
- the player should be able to do the above six actions while crouching
- Melee attack animation
- Gun sound when firing
- Bullets visual impact on walls (see decals documentation)
- Blood particles when hit
- The map should include areas where the player can hide from enemies by `ducking` or taking cover `behind walls and props`.
- There should be pickable ammunition scattered throughout the map.
- Health packs should be placed around the map for the player to collect.
- The game should contain a main menu where the player can:
#### Player:
- Start the game
- Adjust the general sound of the game
- Change the graphics settings
- When changing the resolution, a pop-up should appear in the middle of the screen asking if the player wants to keep the graphics setting he/she just applied. If `Yes` is clicked within 10 seconds, the settings are set, otherwise, if the 10 seconds delay is over or if the player clicks `No`, the settings go back to the old ones.
- Change the mouse sensitivity
- Invert the mouse vertical axis
- The player should have animations for `walking`, `running`, `shooting`, `ducking`, and performing `melee attacks`.
- Blood particles should appear when the player is hit.
- A health bar should decrease whenever the player takes damage.
- Upon death, the player should have the option to quit the game, return to the main menu, or start over.
- The player's mission is flexible and can be any of the following: completing a task, eliminating all enemies without being detected, or collecting documents. Regardless of the mission, the player will encounter enemies that attempt to hinder their progress.
- When the player successfully completes their mission, a pop-up should appear stating that the mission is complete.
- You should have at least 3 types of enemies: `Guards` (who patrol around and are lookig for intruders), `Drones` (same as Guards but which can fly), and `Cameras` (stationary and looking for intruders). More enemies can be added if you want to.
#### Gun mechanics:
- Guards AI:
- Guards should be able to patrol around the map;
- A Guard is able to see the player, if the player crosses his field of view;
- When the player enters the field of view of a Guard, the Guard enters into chasing mode and must start running after the player, takes cover and shoots at the player.
- If the player leaves the field of view for a certain time, the Guard goes back to patrol mode.
- Drones AI:
- Drones should be able to patrol around the map;
- A light color should determine the state of the drone (Blue for patrolling, Red for chasing the player);
- Once the player crosses the drone camera, the drone light turns red and the drone enters chasing mode;
- When a drone is in chasing mode, all the guards on the area are alerted, and should enter chasing mode as well;
- When the player is out of the drone sight, the drone turns back to patrol mode;
- The sight radius should be inferior on the drones that on the guards.
- Camera AI:
- Cameras should be placed on walls;
- Cameras should have the same light sign as the drone, so when the player is in the camera sight, the camera light turns red and all Guards enter in chasing mode;
- Like the Drones, Cameras warn guards, whenever the player passes through the camera field of view;
- Some Cameras should lock access of certain areas of the map (for example, close doors) when the player is detected by that camera.
- The player should be able to shoot.
- A widget should display the current number of bullets available to the player.
- When the bullet count reaches 0, the player should be unable to shoot.
- Shooting should trigger both a sound effect and a visual effect.
- Bullets should have a visual impact on walls.
- Drones, Guards and Cameras should have sounds effect whenever they change from chase to patrol mode, as well as the other way around.
#### Enemy:
- All AI should be implemented using Behavior Trees except for the Camera.
- The game should feature at least two types of enemies: `Melee` and `Ranged`.
- Behavior trees should be used to implement enemy AI.
- Enemies should be able to patrol pre-defined paths around the map.
- Enemies should detect the player if the player enters their field of view.
- When the player enters the field of view of an enemy, the enemy enters into chasing mode and must start running after the player.
- `Ranged` enemies should take cover and shoot at the player.
- `Melee` enemies should run close to the player and perform melee attacks.
- Enemies in chase mode alert nearby enemies making them enter chase mode as well.
- If the player leaves the field of view of all enemies for a specified duration, the enemies go back to patrol mode.
- Enemies should have sound effects whenever they change from chase to patrol mode, and vice versa.
- Enemies should have a visual indicator showing whether they are in patrol or chase mode.
- The player mission is up to you, either it can be some task to fix, kill all guards without getting caught or collect documents... Whatever you choose, the player should have enemies on his way to divert him away from his objective.
#### Game loop
- When the player successfully completes his mission, a pop up should appear saying that the mission is completed.
- Pressing `Esc` pauses the game, bringing up a widget similar to the main menu.
- The player should be able to change the game graphics settings the same way as in the main menu.
- The game should last no longer than 6 minutes. After this time, the player is presented with the same choices as when they die: quit the game, return to the main menu, or start over.
- The player has a health bar that should go down whenever the player gets shot. When the player dies, he has the choice to either quit the game, go back to the main menu or start over.
### Bonus
- If the player starts over, the level should not be reloaded. The player should spawn back at the starting point.
- Use your own assets to create a game in your own style by either searching online or creating them on your own.
- When pressing `Esc` the game is set on paused and the main menu widget pops up.
- Have more enemy types, e.g. a turret that is stationary but inflicts significant damage.
- The player should be able to change the game graphics setting exactly like in the main menu.
- Have areas in the game that are locked behind doors that require keys, which you can obtain from specific enemies.
- A game should not last longer than 6 minutes. After that the same choices should appear as when the player dies.
- Have multiple different weapon types that you can pick up around the map and use to finish the mission.
### Resources
Here are some resources to help you tackle potential project challenges:
- [behavior-tree-in-unreal-engine](https://dev.epicgames.com/documentation/en-us/unreal-engine/behavior-tree-in-unreal-engine---quick-start-guide)
- [decal materials and decal actors](https://dev.epicgames.com/documentation/en-us/unreal-engine/decal-materials-in-unreal-engine?application_version=5.4)
- For inspiration, look at games like Metal Gear Solid 1/2/3.
- The basic assets you will need for this project can be found in the [StealthBoomAssets.zip](https://assets.01-edu.org/gamedev/stealth-boom-assets.zip) file. It contains the basic animations, character, enemies and props you will need for this project.
> NOTE: The assets in the file are intended to help streamline the process of locating assets, not to eliminate it.
> TIP: Use [itch.io](https://www.itch.io) to get sound effects (not included in the assets file) or to find extra assets
### Submission
> Do not forget to zip up the project; compile and save everything for peer correction.
> If it is not possible to upload files to Gitea due to their size, use GitHub instead and have a look at [Git LFS](https://docs.github.com/en/repositories/working-with-files/managing-large-files/about-large-files-on-github)

View File

@ -1,69 +1,89 @@
> Due to file size reasons, the solution might be uploaded on GitHub instead of Gitea!
#### Functional
#### Main Menu
###### Does the map contain places for the player to hide from enemies?
###### Is the main menu displayed on the screen with all five options visible?
###### Does the map contain buildings, pickable ammunition and weapons placed around the map?
###### Can the general game sound be adjusted directly from the settings menu?
###### Does the player have all the minimal animation required (walking, running, melee attacking, aiming, reloading, shooting, crouching, crouch walking, crouch aiming, crouch reloading, crouch shooting)?
###### When changing the resolution, does a confirmation pop-up appear in the center of the screen asking if the player wants to keep the new graphics settings?
###### Is there a sound for the player shooting?
###### Are there bullets impacts present when shooting at a wall?
###### When the player is hit, are there any blood particles?
###### Is a main menu with the five options displayed on the screen?
###### Can the general sound of the game be managed directly on the settings menu?
###### When changing the resolution, does a pop-up get displayed on the screen to confirm the changes we just set?
###### Does pressing “No” on the graphics confirmation pop-up, resets the settings to the old ones?
###### If the player presses 'No' on the graphics confirmation pop-up, or if the pop-up is not confirmed within 10 seconds, do the settings revert to the previous ones?
###### Are the mouse settings (mouse sensitivity, invert mouse vertical axis) functioning according to their descriptions?
###### Do the guards and drones wander around the map?
#### Map / Level
###### When a player enters the field of view of a guard, does he switches to chasing mode, running and shooting towards the player while also taking cover?
###### Does the map contain props/walls for the player to hide from enemies?
###### Does the drone light switches between each state? Blue for patrolling and red for chase mode (when a player crosses its sight)?
###### Does the map contain pickable ammunition and health placed around?
###### Whenever a drone turns to chasing mode, do all the guards in the area get alerted and switch to chasing mode as well?
#### Player Mechanics
###### Does the drone come back to patrol mode when the player is out of the drone sight?
###### Is the sight radius of the drones smaller than the guards?
###### Are cameras attached to walls?
###### Do cameras have similar light sign as the drones (red for alert mode and blue for patrol)?
###### As the drones, do the cameras alert guards on the area, switching them to chasing mode?
###### Do Guards, Drones and Cameras play an alert sound when a player gets detected?
###### Do some cameras lock access to certain areas of the map, when they detect a player?
###### Can the camera close some part of the map (thru closed doors, open traps and tries to kill the player etc…) to the player when the player is being detected?
###### Are Behavior Trees used to implement the AI of the Guards and Drones?
###### Does the player have a goal?
###### When the goal of the player is successfully completed, does a pop up appear saying that the mission is completed?
###### Does the player have all the minimal animations required (walking, running, melee attacking, shooting, ducking)?
###### Does the player have a health bar?
###### Does the player health decreases when he gets shot by the guards?
###### Does the player's health decrease when he gets damaged?
###### When the player is hit, are there any blood particles?
###### When the player loses all his health (dies), does he get to choose whether to quit the game, go back to the main menu, or to start over?
###### If the player starts over does he spawn back at the starting point?
###### Does the player have a defined goal or mission?
###### Is the lifespan of the game at least 6 minutes long from launch to mission completed?
###### Whatever the goal or mission, can you confirm that the player had enemies on his way to divert him away from his objective?
###### When the player successfully completes the goal or mission, does a pop-up appear indicating `mission completion`?
#### Gun Mechanics
###### Is the player able to shoot?
###### Is there a sound for the guns shooting?
###### Is a widget showing the remaining bullet count displayed?
###### Is the player unable to shoot when he has no bullets?
###### Are there bullet impacts present when shooting at a wall?
#### Enemies
###### Does the game include at least two types of enemies: `Melee` and `Ranged`?
###### Is enemy AI implemented using Behavior Trees?
###### Do the enemies wander around the map?
###### When a player enters an enemy's field of view, does the enemy switch to a chasing mode?
###### Do melee enemies approach the player to perform melee attacks?
###### Do ranged enemies take cover and shoot at the player?
###### Do the enemies go back to patrol mode when the player is hidden from all fields of view for a set amount of time?
###### Do the enemies have a sound effect and visual effect when entering and exiting chase mode?
###### Are enemies in chase mode alerting nearby enemies?
#### Game loop
###### When 'Esc' is pressed, does the game pause and display a widget similar to the main menu?
###### Can the player perform all the actions that appear in the menu when the game is paused?
###### Does the game loop last no more than 6 minutes from start to finish?
###### Is a widget that shows the player's remaining time displayed?
#### Bonus
###### +Are there headshots implemented?
###### +Did the student use different assets than the ones provided in the subject?
###### +Are there more enemy types than the basic melee and ranged enemies?
###### +Are there areas in the game that require an item to access?
###### +Are there different weapon types that you can pick up and use?

Binary file not shown (new image, 459 KiB).

View File

@ -1,7 +1,7 @@
blinker==1.6.2
click==8.1.6
Flask==2.3.2
Flask-Cors==4.0.1
Flask-Cors==5.0.0
itsdangerous==2.1.2
Jinja2==3.1.4
MarkupSafe==2.1.3

View File

@ -7,6 +7,7 @@ Create two functions which takes an object and a string or array of strings. The
- `omit`: contains only those keys which do not match the string, or do not appear in the array of strings.
> Those functions are pure and must not modify the given object
> They should ignore properties from the prototype chain
### Notions