Commit

minor fixes
kcwongaz committed Aug 30, 2022
1 parent 6c8def0 commit 6c55406
Showing 2 changed files with 7 additions and 4 deletions.
README.md (6 changes: 3 additions & 3 deletions)
@@ -28,7 +28,7 @@ This project requires the standard scientific packages, `numpy`, `scipy`, `matplotlib`
## 2 - Data
An example dataset can be downloaded here.

-The example dataset contains the flight data in Jan 2017. Decompressing the data to `data/` at the project root should get the jupyter notebooks to run.
+The example dataset contains the flight data in Jan 2017. Decompressing the data to `data/` at the project root should get the jupyter notebooks running.

If you are interested in seeing the raw data, here is an example dataset. The raw data is quite large in file size, so I can only provide 3 days of data. To process the raw data, decompress it to `raw/` at the project root, then run

@@ -40,9 +40,9 @@ The scripts in `pipeline/` perform successive processing to prepare the data

<br>

-## 3 - Quick walkthrough
+## 3 - The package

-`air_traffic/`: main package
+Inside `air_traffic/`:
- `FR24Writer.py`, `filters.py`: for processing raw data
- `io.py`: I/O handlers
- `loop.py`: module for analyzing holding patterns and rescheduling
pipeline/3_stat_fixed_distance.py (5 changes: 4 additions & 1 deletion)
@@ -1,6 +1,5 @@
import pandas as pd
import os
-import numpy as np

from air_traffic.trajectory import *

@@ -26,6 +25,10 @@
"delta_t_sec": []} # Time difference in second


+# Create the destination directory if it does not exist
+if not os.path.exists(savedir):
+    os.makedirs(savedir, exist_ok=True)
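+    # exist_ok=True keeps makedirs from raising even if the directory
+    # appears between the existence check and this call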
+
for subdir, dirs, files in os.walk(datadir):

# Process in sorted order for easy tracking
