A Django-based REST API server that provides detailed information about airports worldwide, including real-time timezone data.
- Comprehensive airport information (IATA codes, coordinates, elevation, etc.)
- Real-time timezone data via Google Maps API
- Regular data updates from OurAirports database
- RESTful API endpoints with JSON responses
- CORS support for cross-origin requests
- No authentication required for API access
- Install PostgreSQL:
# Ubuntu/WSL
sudo apt update
sudo apt install postgresql postgresql-contrib
# Start PostgreSQL service
sudo service postgresql start
# Create database and user
sudo -u postgres psql
postgres=# CREATE DATABASE airfield_info;
postgres=# CREATE USER your_user WITH PASSWORD 'your_password';
postgres=# GRANT ALL PRIVILEGES ON DATABASE airfield_info TO your_user;
postgres=# \q
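Once the Python environment described later in this setup is available, you can optionally confirm the new database and role are usable from Python. This is an illustrative sketch only and assumes the psycopg2 driver (Django's PostgreSQL backend dependency) is installed; substitute the credentials you actually created:
# Optional sanity check -- not part of the project; assumes psycopg2 is installed.
import psycopg2

conn = psycopg2.connect(
    dbname="airfield_info",
    user="your_user",
    password="your_password",
    host="localhost",
    port=5432,
)
with conn.cursor() as cur:
    cur.execute("SELECT version();")
    print(cur.fetchone()[0])  # e.g. "PostgreSQL 14.x ..."
conn.close()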
- Install Redis (Optional - only if using caching):
# Ubuntu/WSL
sudo apt update
sudo apt install redis-server
# Start Redis service
sudo service redis-server start
# Test Redis connection
redis-cli ping  # Should return PONG
- Install Python 3.11:
# Ubuntu/WSL
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt update
sudo apt install python3.11 python3.11-dev python3.11-venv
- Install Poetry:
curl -sSL https://install.python-poetry.org | python3 -
- Clone the repository:
git clone https://github.com/Consiliency/Airfield-Info-Server.git
cd Airfield-Info-Server
- Install dependencies using Poetry:
poetry install
- Set up environment variables in .env:
DEBUG=True
SECRET_KEY=your_secret_key
DATABASE_URL=postgres://your_user:your_password@localhost:5432/airfield_info
ALLOWED_HOSTS=localhost,127.0.0.1
GOOGLE_MAPS_API_KEY=your_google_maps_api_key # Required for timezone updates
# Redis settings (optional - only if using caching)
REDIS_URL=redis://localhost:6379/0
USE_REDIS_CACHE=False  # Set to True to enable Redis caching
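For reference, the settings module consumes these variables roughly as in the sketch below. This is an illustrative outline only; the exact parsing in config/settings.py may differ (for example, it may use a helper such as dj-database-url for DATABASE_URL):
# Illustrative only -- typical way the .env values are read; the project's
# actual settings module may differ.
import os

DEBUG = os.environ.get("DEBUG", "False") == "True"
SECRET_KEY = os.environ["SECRET_KEY"]
ALLOWED_HOSTS = os.environ.get("ALLOWED_HOSTS", "localhost").split(",")
GOOGLE_MAPS_API_KEY = os.environ.get("GOOGLE_MAPS_API_KEY", "")
USE_REDIS_CACHE = os.environ.get("USE_REDIS_CACHE", "False") == "True"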
- Verify services are running:
# Check PostgreSQL status
sudo service postgresql status
# Check Redis status (if using caching)
sudo service redis-server status
- Run migrations:
poetry run python manage.py migrate
- Start the development server:
poetry run python manage.py runserver
- Back up the database:
pg_dump -U your_user -d airfield_info > backup.sql
- Restore the database from a backup:
psql -U your_user -d airfield_info < backup.sql
- Reset the database and re-run migrations:
poetry run python manage.py reset_db  # Requires django-extensions
poetry run python manage.py migrate
- If PostgreSQL service fails to start:
sudo systemctl status postgresql # Check detailed status
sudo journalctl -u postgresql  # Check logs
- If connection fails:
- Verify PostgreSQL is running
- Check database URL in .env
- Ensure user has correct permissions
- If Redis service fails to start:
sudo systemctl status redis-server # Check detailed status
sudo journalctl -u redis-server  # Check logs
- If connection fails:
- Verify Redis is running
- Check Redis URL in .env
- Test connection:
redis-cli ping
- If timezone updates fail:
- Verify Google Maps API key is valid
- Check API quota limits
- Ensure internet connectivity
- If database migrations fail:
- Verify PostgreSQL is running
- Check database permissions
- Review migration logs
The API endpoints are publicly accessible and do not require authentication or API keys. The Google Maps API key is only used server-side for updating timezone information.
GET /api/airports/by_iata/?code={iata_code}&include_timezone=true
Parameters:
- code (required): IATA airport code (e.g., LAX, JFK)
- include_timezone (optional): Set to "true" to include current timezone information
Example Request:
curl "http://localhost:8000/api/airports/by_iata/?code=LAX&include_timezone=true"Example Response:
{
"id": "26434",
"ident": "VABB",
"iata_code": "BOM",
"name": "Chhatrapati Shivaji International Airport",
"type": "large_airport",
"latitude": "19.088699",
"longitude": "72.867897",
"elevation_ft": 39.0,
"continent": "AS",
"iso_country": "IN",
"iso_region": "IN-MM",
"municipality": "Mumbai",
"scheduled_service": true,
"gps_code": "VABB",
"local_code": null,
"home_link": "http://www.csia.in/",
"wikipedia_link": "https://en.wikipedia.org/wiki/Chhatrapati_Shivaji_International_Airport",
"keywords": "Bombay, Sahar International Airport",
"timezone": {
"timezone_id": "Asia/Calcutta",
"timezone_name": "India Standard Time",
"raw_offset": 19800,
"dst_offset": 0,
"total_offset": 5.5,
"last_updated": "2025-01-08T09:43:20.622030Z",
"aliases": [
"Asia/Kolkata"
]
},
"updated": "2025-01-08T09:43:20.641220Z"
}
GET /api/airports/by_icao/?code={icao_code}&include_timezone=true
Parameters:
- code (required): ICAO airport code (e.g., KLAX, KJFK)
- include_timezone (optional): Set to "true" to include current timezone information
Example Request:
curl "http://localhost:8000/api/airports/by_icao/?code=KLAX&include_timezone=true"Response format is the same as the IATA endpoint.
The server automatically checks for updates from the OurAirports database every 7 days. The data source is:
https://davidmegginson.github.io/ourairports-data/airports.csv
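The update itself runs server-side, but if you want to inspect the source data, the CSV can be fetched and parsed directly. A rough illustration (not the project's own update code) using the standard library plus requests; the field names mirror those in the API response:
# Rough illustration of reading the OurAirports CSV -- not the server's update logic.
import csv
import io
import requests

URL = "https://davidmegginson.github.io/ourairports-data/airports.csv"
rows = csv.DictReader(io.StringIO(requests.get(URL, timeout=30).text))
with_iata = [row for row in rows if row.get("iata_code")]
print(f"{len(with_iata)} airports with an IATA code")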
- Timezone data is fetched from the Google Maps Time Zone API (server-side only)
- Timezone information is cached and updated every 90 days
- The total_offset field includes both the raw UTC offset and any DST offset (see the worked example below)
- Times are returned in ISO 8601 format with UTC timezone
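As a concrete check, the BOM response above reports raw_offset 19800 seconds and dst_offset 0, so total_offset is (19800 + 0) / 3600 = 5.5 hours. The same figure can be reproduced with Python's standard zoneinfo module:
# Reproduce total_offset from the example response (values from the BOM payload above).
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

raw_offset, dst_offset = 19800, 0                 # seconds, from the API response
print((raw_offset + dst_offset) / 3600)           # 5.5

tz = ZoneInfo("Asia/Kolkata")                     # listed alias of Asia/Calcutta
now = datetime.now(timezone.utc)
print(tz.utcoffset(now).total_seconds() / 3600)   # 5.5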
- Install development dependencies:
poetry install --with dev
- Run tests:
poetry run python manage.py test
- Set up AWS Account and CLI
- Create an AWS account if you don't have one
- Install AWS CLI and configure credentials:
aws configure
- Set up AWS Elastic Beanstalk
- Install EB CLI:
pip install awsebcli
- Initialize EB project:
eb init -p python-3.10 airfield-info-server
- Create Required AWS Files
Create .ebextensions/01_packages.config:
packages:
  yum:
    postgresql-devel: []
    python3-devel: []
    gcc: []
Create .ebextensions/02_python.config:
option_settings:
  aws:elasticbeanstalk:container:python:
    WSGIPath: config.wsgi:application
  aws:elasticbeanstalk:environment:proxy:staticfiles:
    /static: staticfiles
Create Procfile:
web: gunicorn config.wsgi --bind 0.0.0.0:8000
- Update Dependencies
poetry add gunicorn
poetry export -f requirements.txt --output requirements.txt --without-hashes
- Configure Production Settings
Create config/production.py:
import os
from .settings import *
DEBUG = False
ALLOWED_HOSTS = ['your-eb-url.elasticbeanstalk.com', 'your-domain.com']
# Configure static files for S3
AWS_STORAGE_BUCKET_NAME = 'your-bucket-name'
AWS_S3_REGION_NAME = 'your-region'
AWS_S3_CUSTOM_DOMAIN = f'{AWS_STORAGE_BUCKET_NAME}.s3.amazonaws.com'
STATIC_URL = f'https://{AWS_S3_CUSTOM_DOMAIN}/static/'
# Configure database
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ['RDS_DB_NAME'],
        'USER': os.environ['RDS_USERNAME'],
        'PASSWORD': os.environ['RDS_PASSWORD'],
        'HOST': os.environ['RDS_HOSTNAME'],
        'PORT': os.environ['RDS_PORT'],
    }
}
- Create Elastic Beanstalk Environment
eb create airfield-info-prod --database --database.engine postgres
- Configure Environment Variables
- Go to AWS Elastic Beanstalk Console
- Select your environment
- Go to Configuration → Software
- Add environment variables:
DJANGO_SETTINGS_MODULE=config.production
GOOGLE_MAPS_API_KEY=your-key
SECRET_KEY=your-secret-key
- Deploy Application
eb deploy
- Configure AWS Credentials
- Go to AWS IAM Console
- Create a new IAM user for GitHub Actions
- Add the AWSElasticBeanstalkFullAccess policy
- Save the Access Key ID and Secret Access Key
- Set up GitHub Secrets
- Go to your GitHub repository settings
- Navigate to Secrets and Variables → Actions
- Add the following secrets:
AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
AWS_REGION
- Create GitHub Actions Workflow
Create .github/workflows/deploy.yml:
name: Deploy to AWS Elastic Beanstalk

on:
  push:
    branches: [ main ]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: '3.10'
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install poetry
          poetry export -f requirements.txt --output requirements.txt --without-hashes
      - name: Deploy to EB
        uses: einaregilsson/beanstalk-deploy@v21
        with:
          aws_access_key: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws_secret_key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          application_name: airfield-info-server
          environment_name: airfield-info-prod
          region: ${{ secrets.AWS_REGION }}
          deployment_package: .
Now your application will automatically deploy to AWS Elastic Beanstalk whenever you push to the main branch.
- Get Elastic Beanstalk Domain
- Note your EB URL (e.g., your-app.elasticbeanstalk.com)
- Create Route 53 Hosted Zone
- Go to Route 53 in AWS Console
- Create a hosted zone for your domain
- Note the nameservers provided by AWS
- Update GoDaddy Nameservers
- Log in to GoDaddy
- Go to Domain Settings → Nameservers
- Select "Custom" and add AWS nameservers
- Save changes (may take 24-48 hours to propagate)
- Configure DNS in Route 53
- Create an A record:
  - Name: @ (root domain)
  - Type: A
  - Alias: Yes
  - Target: Your Elastic Beanstalk environment
- Create CNAME for www subdomain if needed
- Update Elastic Beanstalk Configuration
- Add your domain to ALLOWED_HOSTS in production settings
- Update environment variables if needed
- Deploy changes:
eb deploy
- Request SSL Certificate
- Go to AWS Certificate Manager
- Request a certificate for your domain
- Add both root and www versions
- Verify ownership through DNS validation
- Configure Load Balancer
- In Elastic Beanstalk, modify environment configuration
- Add HTTPS listener on port 443
- Select your SSL certificate
- Force HTTPS by redirecting HTTP to HTTPS
- Set up CloudWatch alarms for monitoring
- Configure automatic backups for RDS database
- Set up S3 lifecycle policies for logs
- Enable AWS X-Ray for performance monitoring
Additional environment variables for production:
- DJANGO_SETTINGS_MODULE: Set to 'config.production'
- RDS_DB_NAME: PostgreSQL database name
- RDS_USERNAME: Database username
- RDS_PASSWORD: Database password
- RDS_HOSTNAME: Database host
- RDS_PORT: Database port
- AWS_ACCESS_KEY_ID: AWS access key
- AWS_SECRET_ACCESS_KEY: AWS secret key
- AWS_STORAGE_BUCKET_NAME: S3 bucket for static files
- Fork the repository
- Create a feature branch
- Commit your changes
- Push to the branch
- Create a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Create an AWS account if you haven't already.
- Use AWS Elastic Beanstalk for easy deployment.
# Update package list and install prerequisites
sudo apt-get update
sudo apt-get install -y unzip curl
# Download the AWS CLI installation file
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
# Unzip the installer
unzip awscliv2.zip
# Run the install program
sudo ./aws/install
# Verify the installation
aws --version
For other operating systems, download and install the AWS CLI from the AWS CLI website.