
Commit ab11150
EDSC-4351: Implements AWS CDK
1 parent: 154a326

File tree: 117 files changed (+27697, -14041 lines)


.env

Lines changed: 12 additions & 0 deletions
@@ -0,0 +1,12 @@
+; These values shouldn't need to change for local development, and are not being used for
+; any secret values, so they do not need to be git ignored
+
+AWS_ACCESS_KEY_ID=12345678
+AWS_REGION=us-east-1
+AWS_SECRET_ACCESS_KEY=12345678
+CACHE_KEY_EXPIRE_SECONDS=84000
+GENERATE_NOTEBOOKS_BUCKET_NAME=earthdata-search-dev-generate-notebooks
+NODE_ENV=development
+
+USE_IMAGE_CACHE=false
+SKIP_SQS=true
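The new `.env` uses `;` comment lines and plain `KEY=value` pairs. As a minimal sketch (illustrative only — the project almost certainly uses an off-the-shelf dotenv loader rather than this hand-rolled parser), parsing such a file in Node could look like:

```javascript
// Hypothetical parser for the `;`-commented KEY=value format shown in this .env.
const parseEnv = (text) => Object.fromEntries(
  text.split('\n')
    .map((line) => line.trim())
    // Skip blank lines and `;` comment lines like the ones at the top of the file
    .filter((line) => line && !line.startsWith(';'))
    .map((line) => {
      const index = line.indexOf('=')
      return [line.slice(0, index), line.slice(index + 1)]
    })
)

const env = parseEnv([
  '; not a secret',
  'AWS_REGION=us-east-1',
  'SKIP_SQS=true'
].join('\n'))

console.log(env.AWS_REGION) // 'us-east-1'
```

Note that every value comes back as a string, so flags like `SKIP_SQS=true` still need an explicit comparison against `'true'` wherever they are consumed.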

.github/workflows/performance.yml

Lines changed: 1 addition & 1 deletion
@@ -55,4 +55,4 @@ jobs:
 path: |
 playwright-report/
 test-results/
-retention-days: 30
+retention-days: 30

.gitignore

Lines changed: 2 additions & 0 deletions
@@ -24,3 +24,5 @@ playwright-coverage
 /playwright-report/
 /playwright/.cache/
 portals/availablePortals.json
+
+elasticmq.conf
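The new `.gitignore` entry keeps a developer's local ElasticMQ config out of version control. A hypothetical minimal `elasticmq.conf` might look like the following — ElasticMQ reads HOCON-format config, but the queue name, ports, and timeout here are illustrative guesses, not this project's actual settings:

```
include classpath("application.conf")

node-address {
  protocol = http
  host = localhost
  port = 9324
}

rest-sqs {
  enabled = true
  bind-port = 9324
}

queues {
  # Illustrative queue; the real queue names would come from the app's config
  example-local-queue {
    defaultVisibilityTimeout = 10 seconds
  }
}
```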

.nvmrc

Lines changed: 0 additions & 1 deletion
@@ -1,2 +1 @@
 lts/hydrogen
-
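The `.nvmrc` change just drops a trailing blank line. Tooling that reads `.nvmrc` typically trims whitespace before resolving the alias, so a sketch of that resolution step (the function name is illustrative) is simply:

```javascript
// Trim the raw .nvmrc contents to get the version alias, with or without
// a trailing newline or blank line.
const resolveNvmrc = (raw) => raw.trim()

console.log(resolveNvmrc('lts/hydrogen\n')) // 'lts/hydrogen'
```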

README.md

Lines changed: 62 additions & 64 deletions
@@ -1,14 +1,13 @@
 # [Earthdata Search](https://search.earthdata.nasa.gov)
 
-[![serverless](http://public.serverless.com/badges/v3.svg)](http://www.serverless.com)
 ![Build Status](https://github.com/nasa/earthdata-search/workflows/CI/badge.svg?branch=main)
 [![codecov](https://codecov.io/gh/nasa/earthdata-search/branch/main/graph/badge.svg?token=kIkZQ0NrqK)](https://codecov.io/gh/nasa/earthdata-search)
 [![Known Vulnerabilities](https://snyk.io/test/github/nasa/earthdata-search/badge.svg)](https://snyk.io/test/github/nasa/earthdata-search)
 
 ## About
 
 Earthdata Search is a web application developed by [NASA](http://nasa.gov) [EOSDIS](https://earthdata.nasa.gov) to enable data discovery, search, comparison, visualization, and access across EOSDIS' Earth Science data holdings.
-It builds upon several public-facing services provided by EOSDIS, including the [Common Metadata Repository (CMR)](https://cmr.earthdata.nasa.gov/search/) for data discovery and access, EOSDIS [User Registration System (URS)](https://urs.earthdata.nasa.gov) authentication, the [Global Imagery Browse Services (GIBS)](https://earthdata.nasa.gov/gibs) for visualization, and a number of OPeNDAP services hosted by data providers.
+It builds upon several public-facing services provided by EOSDIS, including the [Common Metadata Repository (CMR)](https://cmr.earthdata.nasa.gov/search/) for data discovery and access, EOSDIS [Earthdata Login (EDL)](https://urs.earthdata.nasa.gov) authentication, the [Global Imagery Browse Services (GIBS)](https://earthdata.nasa.gov/gibs) for visualization, and a number of OPeNDAP services hosted by data providers.
 
 ## License
 
@@ -24,43 +23,26 @@ It builds upon several public-facing services provided by EOSDIS, including the
 
 ## Application Installation and Usage
 
-The Earthdata Search application uses Node v18 and Vite 5 to generate static assets. The serverless application utilizes the following AWS services (important to note if deploying to an AWS environment):
+The Earthdata Search application uses NodeJS and Vite to generate static assets. The serverless application utilizes the following AWS services (important to note if deploying to an AWS environment):
 
 - S3
 - We highly recommend using CloudFront in front of S3.
 - SQS
+- Step Functions
 - API Gateway
 - Lambda
 - Cloudwatch (Events)
 
 ### Prerequisites
 
-##### Node
+#### NodeJS
 
-Earthdata Search runs on Node.js, in order to run the application you'll need to [install it](https://nodejs.org/en/download/).
+We recommend using [Node Version Manager](https://github.com/nvm-sh/nvm?tab=readme-ov-file#installing-and-updating) (NVM) to manage your NodeJS install. Use the shell integration to [automatically switch Node versions](https://github.com/nvm-sh/nvm?tab=readme-ov-file#calling-nvm-use-automatically-in-a-directory-with-a-nvmrc-file).
 
-**Recommended:** Use Homebrew
-
-brew install node
-
-##### NPM
-
-npm is a separate project from Node.js, and tends to update more frequently. As a result, even if you’ve just downloaded Node.js (and therefore npm), you’ll probably need to update your npm. Luckily, npm knows how to update itself! To update your npm, type this into your terminal:
-
-npm install -g npm@latest
-
-##### NVM
-
-To ensure that you're using the correct version of Node it is recommended that you use Node Version Manager. Installation instructions can be found on [the repository](https://github.com/nvm-sh/nvm#install--update-script). The version used is defined in .nvmrc and will be used automatically if NVM is configured correctly. Using nvm we can switch node versions to the one utilized by Earthdata Search. From the top-level directory:
+NVM will automatically install the correct node version defined in `.nvmrc`
 
 nvm use
 
-##### Running Serverless Framework locally
-
-Earthdata Search utilizes the [Serverless Framework](https://serverless.com/) for managing AWS resources. In order to fully run and manage the application you'll need to install it:
-
-npm install -g serverless@latest
-
 ##### PostgreSQL
 
 Earthdata Search uses PostgreSQL in production on AWS RDS. If you don't already have it installed, [download](https://www.postgresql.org/download/) and install it to your development environment.
@@ -81,6 +63,30 @@ If you decide to install via Homebrew you'll need to create the default user.
 
 createuser -s postgres
 
+##### Docker
+
+In order to simulate S3 locally we use [Minio](https://min.io/docs/minio/container/index.html) within a docker container.
+
+##### Docker, Optional
+
+In order to simulate SQS locally we use [ElasticMQ](https://github.com/softwaremill/elasticmq) within a docker container.
+
+##### Redis, Optional
+
+To use an image cache you need to have Redis installed.
+
+**Recommended:** Use Homebrew
+
+brew install redis
+
+Optionally you can run Redis in a Docker container with
+
+npm run start:cache
+
+To stop the Redis Docker container
+
+npm run stop:cache
+
 ### Initial Setup
 
 ##### Package Installation
@@ -117,82 +123,71 @@ Ensure that you have a database created:
 
 To run the migrations locally:
 
-DATABASE_URL=postgresql://USERNAME:PASSWORD@localhost:5432/edsc_dev npm run migrate up
-
-Optionally, we can run the migration locally and not within a deployed Lambda. When deployed our database migrations run within Lambda due to the fact that in non-development environments our resources are not publicly accessible. To run the migrations you'll need to invoke the Lambda:
-
-serverless invoke local --function migrateDatabase
+npm run invoke-local migrateDatabase
 
 ###### Creating a new database migration
 
 To create a new database migration use this command to ensure the migration follow the same timestamp name scheme.
 
 npm run migrate create name-of-migration
 
-### Building the Application
-
-The production build of the application will be output in the `/static/dist/` directory:
-
-npm run build
-
-This production build can be run locally with any number of http-server solutions. A simple one is to use the http-server package
-
-npx http-server static/dist
-
 ### Run the Application Locally
 
 The local development environment for the static assets can be started by executing the command below in the project root directory:
 
-npm run start
+npm start
 
-This will run the React application at [http://localhost:8080](http://localhost:8080) -- please see `Serverless Framework` below for enabling the 'server' side functionality.
+This will start everything you need to run Earthdata Search locally.
 
-### Serverless Framework
+- React application: [http://localhost:8080](http://localhost:8080)
+- Mock API Gateway: [http://localhost:3001](http://localhost:3001)
+- Watch for code changes to the `serverless` directory
+- ElasticMQ container for SQS Queues.
+- Mock SQS service to trigger lambdas on SQS messages.
+- Mock S3 service for generating notebooks.
 
-The [serverless framework](https://serverless.com/framework/docs/providers/aws/) offers many plugins which allow for local development utilizing many of the services AWS offers. For the most part we only need API Gateway and Lambda for this application but there are plugins for many more services (a list of known exceptions will be maintained below).
-
-##### Exceptions
-
-- SQS
+#### Optional Services
 
-While there is an sqs-offline plugin for serverless it still requires an actual queue be running, we may investigate this in the future but for now sqs functionality isn't available while developing locally which means the following pieces of functionality will not operate locally:
+By default we don't run SQS or an image cache locally. In order to run the application with those services you need to include the follow environment variables when you start the application
 
-- Generating Colormaps
+USE_IMAGE_CACHE=true SKIP_SQS=false npm start
 
-- Scale images
+Or run
 
-Scaling thumbnail images utilizes a redis cache in the deployed environment. To utilize this cache locally you'll need to install Redis on the dev machine. The easiest way to do this would be by running it in a docker container using the command `npm run start:cache`. You can also use a visualizer such as `RedisInsight` to more easily inspect the cache. You will also need to set the environment variable `USE_CACHE` locally to `true` with `export USE_CACHE=true` or add the environment variable to your shell script. To stop the docker container use the `npm run stop:cache` command.
+npm run start:optionals
 
-#### Running API Gateway and Lambda Locally
+### Building the Application
 
-Running the following command will spin up API Gateway and Lambda locally which will open up a vast majority of the functionality the backend offers.
+The production build of the application will be output in the `/static/dist/` directory:
 
-npm run offline
+npm run build
 
-This will provide access to API Gateway at [http://localhost:3001](http://localhost:3001)
+This production build can be run locally with any number of http-server solutions. A simple one is to use the http-server package
 
-Additionally, this ties in with `esbuild` which will ensure that your lambdas are re-built when changes are detected.
+npx http-server static/dist
 
 ### Invoking lambdas locally
 
 To invoke lambdas locally we must create a stringified JSON file with the order information to the specific lambda we are trying to run the structure of the events will differ between the lambda. Typically this will include data from your local database instance which is used in the event information.
 
-npm run invoke-local -- --function <name-of-lambda-function> --path ./event.json
-
-You may need to also set the `IS_OFFLINE` environment variable when invoking the lambda locally
-
-export IS_OFFLINE=true
+npm run invoke-local <name-of-lambda-function> ./path/to/event.json
 
 ### Run the Automated [Jest](https://jestjs.io/) tests
 
 Once the project is built, you must ensure that the automated unit tests pass:
 
npm run test
 
-To get coverage on modules run
+To run Jest in `watch` mode
+
+npm run test:watch
+
+To only get coverage on files tested run
+
 npm run test:watch-lite
 
-test coverage will be updated in the coverage directory to see breakdown use
+Test coverage will be updated in the coverage directory to see breakdown use
+
 open coverage/lcov-report/index.html
 
 ### Deployment
@@ -201,6 +196,7 @@ When the time comes to deploy the application, first ensure that you have the re
 
 - AWS_ACCESS_KEY_ID
 - AWS_SECRET_ACCESS_KEY
+- STAGE_NAME
 
 This application runs in a VPC for NASA security purposes, therefore the following values are expected when a deployment occurs:
 
@@ -216,4 +212,6 @@ For production use, this application uses Scatter Swap to obfuscate some IDs --
 
 To deploy the full application use the following:
 
-NODE_ENV=production serverless deploy --stage UNIQUE_STAGE
+bin/deploy_bamboo.sh
+
+Note: In that script all the env variables are prefixed with `bamboo_` to match our deployments.
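The README's new deployment note says the deploy script expects env variables prefixed with `bamboo_`. A hypothetical sketch of that convention (the helper name and the idea of dropping unprefixed keys are illustrative, not taken from `bin/deploy_bamboo.sh` itself):

```javascript
// Map Bamboo-style env vars (`bamboo_FOO`) back to the plain names (`FOO`)
// that the deployment tooling would read.
const stripBambooPrefix = (env) => Object.fromEntries(
  Object.entries(env)
    .filter(([key]) => key.startsWith('bamboo_'))
    .map(([key, value]) => [key.slice('bamboo_'.length), value])
)

const mapped = stripBambooPrefix({
  bamboo_STAGE_NAME: 'sit',
  bamboo_AWS_REGION: 'us-east-1',
  PATH: '/usr/bin' // unprefixed keys are ignored in this sketch
})

console.log(mapped.STAGE_NAME) // 'sit'
```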

api.nodemon.json

Lines changed: 14 additions & 0 deletions
@@ -0,0 +1,14 @@
+{
+  "restartable": "rs",
+  "ignore": [
+    ".git",
+    "node_modules/**/node_modules",
+    "static",
+    "*.test.*"
+  ],
+  "watch": [
+    "*.config.json",
+    "serverless/dist"
+  ],
+  "delay": 2000
+}
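The new `api.nodemon.json` is plain JSON, so a wrapper script can read and inspect it directly. A small sketch (the config is embedded inline here for self-containment; a real script would read `api.nodemon.json` from disk):

```javascript
// Parse the nodemon config added in this commit and inspect its settings.
const config = JSON.parse(`{
  "restartable": "rs",
  "ignore": [".git", "node_modules/**/node_modules", "static", "*.test.*"],
  "watch": ["*.config.json", "serverless/dist"],
  "delay": 2000
}`)

// nodemon treats a numeric "delay" in config as milliseconds, so rebuilds
// landing in serverless/dist trigger a restart at most every 2 seconds.
console.log(config.watch.includes('serverless/dist')) // true
```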

babel.config.json

Lines changed: 1 addition & 2 deletions
@@ -5,8 +5,7 @@
 {
   "targets": {
     "node": "18",
-    "esmodules": true,
-    "ie": "11"
+    "esmodules": true
   }
 }
 ],
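Dropping the `"ie": "11"` Babel target means modern syntax already supported by Node 18 and ES-module browsers can pass through untranspiled. An illustrative example (the granule object is made up, not from the codebase):

```javascript
// With only node >= 18 / esmodules targets, optional chaining and nullish
// coalescing no longer need to be compiled down for IE 11.
const granule = { links: [{ href: 'https://example.com/data.nc4' }] }
const firstHref = granule.links?.[0]?.href ?? 'no link'

console.log(firstHref) // 'https://example.com/data.nc4'
```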
