This repository has been archived by the owner on Oct 25, 2024. It is now read-only.

Test preparation for week 05 #75

Open
wants to merge 31 commits into main

Commits (31):
9e16337  Add skeleton of earthquakes solution (ageorgou, Oct 23, 2020)
806071e  Earthquake solution after testing (goldsmdn, Oct 25, 2020)
c164fc5  Fixed code to add dates for the last century (goldsmdn, Oct 26, 2020)
ef08d9f  Graph (goldsmdn, Oct 29, 2020)
5cffccc  Answers UCL-RITS/rse-classwork-2020#61 (goldsmdn, Nov 2, 2020)
63dee6a  Answers UCL-RITS/rsd-classwork-2020#61 (goldsmdn, Nov 3, 2020)
34fafd3  Classwork (goldsmdn, Nov 5, 2020)
a4cd21e  Paramaterisation (goldsmdn, Nov 5, 2020)
12afdb0  travis changes (goldsmdn, Nov 5, 2020)
c0d7cd0  YAML change (goldsmdn, Nov 7, 2020)
bdf42e1  Changes to YAML exercise (goldsmdn, Nov 7, 2020)
e4e7a31  Tidy Up (goldsmdn, Nov 7, 2020)
c8f7e3b  Commit yaml file to main (goldsmdn, Nov 7, 2020)
d3a9c21  travis yaml file (goldsmdn, Nov 7, 2020)
f46c258  YAML file (goldsmdn, Nov 7, 2020)
808ed2c  Travis file in head (goldsmdn, Nov 7, 2020)
c6bc43b  Travis file in main (goldsmdn, Nov 7, 2020)
6c7e0eb  Travis file (goldsmdn, Nov 7, 2020)
ed8fa03  Repush changes (goldsmdn, Nov 7, 2020)
22347b5  Travis (goldsmdn, Nov 7, 2020)
5d09280  fixture (goldsmdn, Nov 7, 2020)
9baeb03  Merge branch 'testing' of github.com:goldsmdn/rse-classwork-2020 into… (goldsmdn, Nov 7, 2020)
648b6db  fixtures (goldsmdn, Nov 7, 2020)
03f04f4  tidy up (goldsmdn, Nov 7, 2020)
2e78ec3  ISS calls (goldsmdn, Nov 8, 2020)
fbd54e7  Minor changes (goldsmdn, Nov 8, 2020)
2671630  travis cd change (goldsmdn, Nov 9, 2020)
89739d6  Week07 exercise (goldsmdn, Nov 26, 2020)
66242a6  Week 07 work (goldsmdn, Nov 26, 2020)
95248eb  Week 07 work (goldsmdn, Nov 26, 2020)
56be85c  Created using Colab (goldsmdn, Jan 24, 2025)
11 changes: 11 additions & 0 deletions .travis.yml
Original file line number Diff line number Diff line change
@@ -0,0 +1,11 @@
# YAML file for Travis CI
language: python
python:
  - "3.8"
# command to install dependencies
install:
  - pip install requests pyyaml pytest pytest-cov
# command to run tests
script:
  - cd week05-testing
  - pytest --cov
156 changes: 156 additions & 0 deletions MyFirstColabNotebook.ipynb
@@ -0,0 +1,156 @@
{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"provenance": [],
"authorship_tag": "ABX9TyOPlIfISBZfaHXcsqARRLam",
"include_colab_link": true
},
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
},
"language_info": {
"name": "python"
}
},
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "view-in-github",
"colab_type": "text"
},
"source": [
"<a href=\"https://colab.research.google.com/github/goldsmdn/rse-classwork-2020/blob/testing/MyFirstColabNotebook.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "code",
"source": [
        "time.sleep(5)\n",
        "print(time.ctime())\n"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "O9oxYXdBIqbw",
"outputId": "66a38f09-f582-41ca-9be0-42d3d5ec03bb"
},
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Fri Jan 24 16:34:46 2025\n"
]
}
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "5oFavWTQG3eU",
"outputId": "bd234d1f-c978-42b1-88ee-1be210fc625c"
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Fri Jan 24 16:34:41 2025\n"
]
}
],
"source": [
"import time\n",
"print(time.ctime())"
]
},
{
"cell_type": "markdown",
"source": [],
"metadata": {
"id": "FHVD_VJ3LAK4"
}
},
{
"cell_type": "markdown",
"source": [
"This is **bold**\n",
"This is *italic*\n",
        "This is ~~strikethrough~~"
],
"metadata": {
"id": "L4dShNKlLDlc"
}
},
{
"cell_type": "markdown",
"source": [
"$\\sqrt{3x-1}+(1+x)^2$"
],
"metadata": {
"id": "jLFXfJ_kLQbW"
}
},
{
"cell_type": "markdown",
"source": [
"$e^x = \\sum_{i = 0}^\\infty \\frac{1}{i!}x^i$"
],
"metadata": {
"id": "wbTdUXr1LhZ5"
}
},
{
"cell_type": "code",
"source": [],
"metadata": {
"id": "oz5FolwCLpou"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"source": [
"Constraints are\n",
        " - $3x_1 + 6x_2 + x_3 \\leq 28$\n",
        " - $7x_1 + 3x_2 + 2x_3 \\leq 37$\n",
        " - $4x_1 + 5x_2 + 2x_3 \\leq 19$\n",
        " - $x_1, x_2, x_3 \\geq 0$\n",
"\n",
"The trial vector is calculated as follows:\n",
        "- $u_i(t) = x_i(t) + \\beta(\\hat{x}(t) - x_i(t)) + \\beta \\sum_{k = 1}^{n_v}(x_{i1,k}(t) - x_{i2,k}(t))$\n",
        "\n",
        "$f(x_1, x_2) = 20 + e - 20\\exp(-0.2 \\sqrt{\\frac{1}{n}(x_1^2 + x_2^2)}) - \\exp(\\frac{1}{n}(\\cos(2\\pi x_1) + \\cos(2\\pi x_2)))$\n",
"\n",
        "$x \\in [-5, 5]$\n",
        ">$A_{m,n} =\n",
        " \\begin{pmatrix}\n",
        " a_{1,1} & a_{1,2} & \\cdots & a_{1,n} \\\\\n",
        " a_{2,1} & a_{2,2} & \\cdots & a_{2,n} \\\\\n",
        " \\vdots & \\vdots & \\ddots & \\vdots \\\\\n",
        " a_{m,1} & a_{m,2} & \\cdots & a_{m,n}\n",
        " \\end{pmatrix}$"
],
"metadata": {
"id": "84QMwRsxLvWe"
}
},
{
"cell_type": "markdown",
"source": [],
"metadata": {
"id": "AENi3b1fLxDP"
}
}
]
}
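The Ackley-style function typeset in the notebook can be checked numerically. A minimal sketch, assuming the two-variable form written in the cell (the function name and `n = 2` choice are mine):

```python
import math

def ackley(x1, x2, n=2):
    """The notebook's test function f(x1, x2), with n = 2 variables."""
    term1 = -20 * math.exp(-0.2 * math.sqrt((x1 ** 2 + x2 ** 2) / n))
    term2 = -math.exp((math.cos(2 * math.pi * x1) + math.cos(2 * math.pi * x2)) / n)
    return 20 + math.e + term1 + term2

# At the origin the exponentials cancel the constants exactly, so f = 0,
# the global minimum over x in [-5, 5]^2.
print(round(ackley(0.0, 0.0), 10))  # 0.0
```

This also makes the missing closing parenthesis in the typeset formula easy to spot: every `exp(...)` must close before the final `$`.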
40 changes: 40 additions & 0 deletions week04/quakes.py
@@ -0,0 +1,40 @@
"""A script to find the biggest earthquake in an online dataset."""

# At the top of the file, import any libraries you will use.
import requests

# When you run the file, it should print out the location and magnitude
# of the biggest earthquake.
# You can run the file with `python quakes.py` from this directory.
if __name__ == "__main__":
    quakes = requests.get("http://earthquake.usgs.gov/fdsnws/event/1/query.geojson",
                          params={
                              "starttime": "1900-01-01",
                              "maxlatitude": "58.723",
                              "minlatitude": "50.008",
                              "maxlongitude": "1.67",
                              "minlongitude": "-9.756",
                              "minmagnitude": "1",
                              "endtime": "1999-12-31",
                              "orderby": "time-asc"})

    # Initialise the running maximum before the loop.
    max_magnitude = 0
    coords = ''

    # Loop over every feature in the GeoJSON structure.
    for item in quakes.json()["features"]:
        properties = item['properties']
        geometry = item['geometry']
        if properties['mag'] > max_magnitude:
            # New biggest magnitude: record it and its coordinates.
            max_magnitude = properties['mag']
            coords = geometry['coordinates']

    # The results are stored in max_magnitude and coords.
    print(f"The maximum magnitude is {max_magnitude} "
          f"and it occurred at coordinates {coords}.")
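The maximum-finding loop in quakes.py can be exercised without the network by extracting it into a helper and feeding it canned GeoJSON-shaped data. A sketch; `find_biggest` and the sample values are hypothetical, not part of the classwork code:

```python
def find_biggest(features):
    """Return (max_magnitude, coordinates) over a list of GeoJSON-style features."""
    max_magnitude = 0
    coords = None
    for item in features:
        if item['properties']['mag'] > max_magnitude:
            max_magnitude = item['properties']['mag']
            coords = item['geometry']['coordinates']
    return max_magnitude, coords

# Canned data in the same shape as the USGS response (values illustrative).
features = [
    {'properties': {'mag': 2.1}, 'geometry': {'coordinates': [-1.61, 52.73]}},
    {'properties': {'mag': 4.8}, 'geometry': {'coordinates': [-2.15, 52.52]}},
    {'properties': {'mag': 3.3}, 'geometry': {'coordinates': [0.32, 53.04]}},
]
print(find_biggest(features))  # (4.8, [-2.15, 52.52])
```

Factoring the loop out this way is what makes it testable with the same fixture-based approach used later in week05-testing.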
57 changes: 57 additions & 0 deletions week04/quakes_graph.py
@@ -0,0 +1,57 @@
"""A script to plot the average earthquake magnitude per year from an online dataset."""

# At the top of the file, import any libraries you will use.
import time

import matplotlib.pyplot as plt
import requests

# Constants
START = 2000              # start year; needs to align with STARTTIME below
POINTS = 21               # number of years considered
STARTTIME = "2000-01-01"  # start time in string format
ENDTIME = "2020-12-31"    # end time in string format

# When you run the file, it should graph the average magnitude of quakes by year.
# You can run the file with `python quakes_graph.py` from this directory.
if __name__ == "__main__":
    quakes = requests.get("http://earthquake.usgs.gov/fdsnws/event/1/query.geojson",
                          params={
                              "starttime": STARTTIME,
                              "maxlatitude": "58.723",
                              "minlatitude": "50.008",
                              "maxlongitude": "1.67",
                              "minlongitude": "-9.756",
                              "minmagnitude": "1",
                              "endtime": ENDTIME,
                              "orderby": "time-asc"})

    # Define and initialise the lists.
    years = [START + x for x in range(POINTS)]
    counts = [0] * POINTS
    mags = [0] * POINTS
    averages = [0] * POINTS

    # Loop over the whole structure: find each event's year and
    # update that year's count and magnitude total.
    for item in quakes.json()["features"]:
        properties = item['properties']
        # Timestamps are milliseconds since the epoch.
        timestring = time.gmtime(properties['time'] / 1000)
        year = int(time.strftime("%Y", timestring))
        index = year - START
        counts[index] += 1
        mags[index] += properties['mag']

    # Calculate the average magnitude for each year with data.
    for i in range(POINTS):
        if counts[i] > 0:
            averages[i] = mags[i] / counts[i]

    plt.style.use('ggplot')  # set the style before plotting so it takes effect
    plt.plot(years, averages, 'ro')
    plt.xlabel('Year')
    plt.ylabel('Average magnitude')
    plt.title('Earthquake analysis by year')
    plt.show()
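The three parallel lists in quakes_graph.py can also be replaced by dictionaries keyed on year, which avoids the index arithmetic and the fixed START/POINTS window. A sketch under the same assumptions about the feature structure; the helper name and sample timestamps are illustrative:

```python
from collections import defaultdict
import time

def average_magnitude_by_year(features):
    """Map year -> average magnitude for a list of GeoJSON-style features."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for item in features:
        # USGS timestamps are milliseconds since the epoch.
        year = time.gmtime(item['properties']['time'] / 1000).tm_year
        totals[year] += item['properties']['mag']
        counts[year] += 1
    return {year: totals[year] / counts[year] for year in counts}

# Illustrative features only: two events in 2001, one in 2010.
features = [
    {'properties': {'time': 978307200000, 'mag': 2.0}},
    {'properties': {'time': 978393600000, 'mag': 4.0}},
    {'properties': {'time': 1262304000000, 'mag': 3.0}},
]
print(average_magnitude_by_year(features))  # {2001: 3.0, 2010: 3.0}
```

The dictionary version only holds years that actually have events, so no zero-count guard is needed when averaging.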
Binary file added week05-testing/.coverage
Binary file not shown.
Binary file not shown.
Binary file added week05-testing/__pycache__/times.cpython-38.pyc
Binary file not shown.
18 changes: 18 additions & 0 deletions week05-testing/fixture.yml
@@ -0,0 +1,18 @@
# fixture.yml: test cases for compute_overlap_time
- generic:
    time_range_1: ["2010-01-12 10:00:00", "2010-01-12 12:00:00"]
    time_range_2: ["2010-01-12 10:30:00", "2010-01-12 10:45:00", 2, 60]
    expected:
      - ["2010-01-12 10:30:00", "2010-01-12 10:37:00"]
      - ["2010-01-12 10:38:00", "2010-01-12 10:45:00"]
- test_non_overlap:
    time_range_1: ["2010-01-12 10:00:00", "2010-01-12 12:00:00"]
    time_range_2: ["2010-01-12 14:30:00", "2010-01-12 14:45:00", 2, 60]
    expected: []
- test_several_interval:
    time_range_1: ["2010-01-12 10:00:00", "2010-01-12 11:00:00", 2, 60]
    time_range_2: ["2010-01-12 10:30:00", "2010-01-12 10:45:00", 2, 60]
    expected:
      - ["2010-01-12 10:30:30", "2010-01-12 10:37:00"]
      - ["2010-01-12 10:38:00", "2010-01-12 10:45:00"]
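Each fixture entry is a one-key mapping from test name to its properties, which is why the test file unpacks it with `list(...values())[0]`. A minimal sketch of that unpacking on plain Python data in the same shape `yaml.safe_load` produces (values copied from the test_non_overlap entry):

```python
# Same shape as yaml.safe_load produces for fixture.yml.
fixture = [
    {'test_non_overlap': {
        'time_range_1': ["2010-01-12 10:00:00", "2010-01-12 12:00:00"],
        'time_range_2': ["2010-01-12 14:30:00", "2010-01-12 14:45:00", 2, 60],
        'expected': []}},
]

for case in fixture:
    name = list(case.keys())[0]          # e.g. 'test_non_overlap'
    properties = list(case.values())[0]  # the inner dictionary of ranges
    print(name, properties['expected'])  # test_non_overlap []
```

The extra `2, 60` elements in `time_range_2` are the interval-count and gap arguments passed through to `time_range` by the `*` unpacking in the test.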
24 changes: 24 additions & 0 deletions week05-testing/passes.py
@@ -0,0 +1,24 @@
# passes.py
import datetime

import requests


def iss_passes(lat=51.482218, lon=-0.264547, alt=10, n=5):
    """Return (risetime, settime) string pairs for the next n ISS passes."""
    response = requests.get("http://api.open-notify.org/iss-pass.json",
                            params={
                                "lat": lat,
                                "lon": lon,
                                "alt": alt,
                                "n": n})

    passes = response.json()['response']

    return [(datetime.datetime.fromtimestamp(item['risetime']).strftime("%Y-%m-%d %H:%M:%S"),
             datetime.datetime.fromtimestamp(item['risetime'] + item['duration']).strftime("%Y-%m-%d %H:%M:%S"))
            for item in passes]
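passes.py turns each pass's Unix `risetime` and `duration` into a pair of formatted timestamp strings. That conversion can be sketched in isolation (the helper name and timestamp are illustrative; the rendered strings depend on the local timezone, so no expected output is shown):

```python
import datetime

def format_pass(risetime, duration):
    """Convert a Unix risetime and a duration in seconds to (rise, set) strings."""
    rise = datetime.datetime.fromtimestamp(risetime)
    fall = datetime.datetime.fromtimestamp(risetime + duration)
    return (rise.strftime("%Y-%m-%d %H:%M:%S"),
            fall.strftime("%Y-%m-%d %H:%M:%S"))

# Timestamp is illustrative: some instant in November 2020 plus a 10-minute pass.
rise, fall = format_pass(1604851200, 600)
print(rise, "->", fall)
```

Keeping the formatting in a pure helper like this is also what makes the function easy to test against the mocked API response in test_times.py.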
68 changes: 68 additions & 0 deletions week05-testing/test_times.py
@@ -0,0 +1,68 @@
from times import compute_overlap_time, iss_passes, time_range
import yaml
import pytest
from pytest import raises
# `mock` needed adding to the original example to get it to work.
import mock
import datetime

# Testing file.

with open('fixture.yml', 'r') as yamlfile:
    fixture = yaml.safe_load(yamlfile)


# fixture is a list of dictionaries: [{'generic': ...}, {'test_non_overlap': ...}, ...]
@pytest.mark.parametrize("test_name", fixture)
def test_eval(test_name):
    # test_name is a dictionary, e.g. for the first case:
    # {'generic': {'time_range_1': ..., 'time_range_2': ..., 'expected': ...}}
    properties = list(test_name.values())[0]
    first_range = time_range(*properties['time_range_1'])
    second_range = time_range(*properties['time_range_2'])
    expected = [(start, stop) for start, stop in properties['expected']]
    assert compute_overlap_time(first_range, second_range) == expected


def test_time_back():
    # A time range that ends before it starts should raise a ValueError.
    start_time = "2010-01-12 12:00:00"
    end_time = "2010-01-12 10:00:00"
    with raises(ValueError):
        time_range(start_time, end_time)


class ISS_response:
    '''
    This class provides "hardcoded" return values to mock the calls to the online API.
    '''
    @property
    def status_code(self):
        return 200

    def json(self):
        '''
        Mocks the part of the JSON output we need from querying the API.
        '''
        now = datetime.datetime.now().timestamp()
        return {'message': 'success',
                'request': {'altitude': 10.0, 'datetime': now, 'latitude': 51.5074,
                            'longitude': -0.1278, 'passes': 5},
                'response': [{'duration': 446, 'risetime': now + 88433},
                             {'duration': 628, 'risetime': now + 94095},
                             {'duration': 656, 'risetime': now + 99871},
                             {'duration': 655, 'risetime': now + 105676},
                             {'duration': 632, 'risetime': now + 111480}]}


def test_iss_passes():
    with mock.patch("requests.get", new=mock.MagicMock(return_value=ISS_response())) as mymock:
        iss_over_London = iss_passes(51.5074, -0.1278)
        mymock.assert_called_with("http://api.open-notify.org/iss-pass.json",
                                  params={
                                      "lat": 51.5074,
                                      "lon": -0.1278,
                                      # `alt` needed adding to the example to get it to work.
                                      "alt": 10,
                                      "n": 5})
    assert len(iss_over_London) == 5
    # Build a range from yesterday to next week and check that the overlap
    # with it leaves the five passes unchanged.
    yesterday = datetime.datetime.now() - datetime.timedelta(days=1)
    next_week = datetime.datetime.now() + datetime.timedelta(days=7)
    large = time_range(f"{yesterday:%Y-%m-%d %H:%M:%S}", f"{next_week:%Y-%m-%d %H:%M:%S}")
    assert compute_overlap_time(large, iss_over_London) == iss_over_London
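The mocking pattern in test_iss_passes, swapping the network call for an object with hardcoded `status_code` and `json()`, generalises to any API-backed function. A minimal stdlib-only sketch using `unittest.mock` (the class and method names here are mine, not from the classwork code):

```python
from unittest import mock

class ISSClient:
    """Hypothetical client; get_json would normally hit the network."""
    URL = "http://api.open-notify.org/iss-pass.json"

    def get_json(self, url):
        raise RuntimeError("no network in tests")

    def pass_count(self):
        return len(self.get_json(self.URL)['response'])

# Hardcoded return value standing in for the real API response.
fake = {'response': [{'duration': 600, 'risetime': 1700000000}] * 5}
with mock.patch.object(ISSClient, "get_json", return_value=fake) as fake_get:
    count = ISSClient().pass_count()

# The mock records how it was called, so we can assert on the arguments
# just as the classwork test does with assert_called_with.
fake_get.assert_called_with(ISSClient.URL)
print(count)  # 5
```

`mock.patch.object` restores the real method when the `with` block exits, so the patch cannot leak into other tests.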