Earthdata Search is a web application developed by [NASA](http://nasa.gov) [EOSDIS](https://earthdata.nasa.gov) to enable data discovery, search, comparison, visualization, and access across EOSDIS' Earth Science data holdings.

It builds upon several public-facing services provided by EOSDIS, including the [Common Metadata Repository (CMR)](https://cmr.earthdata.nasa.gov/search/) for data discovery and access, EOSDIS [Earthdata Login (EDL)](https://urs.earthdata.nasa.gov) authentication, the [Global Imagery Browse Services (GIBS)](https://earthdata.nasa.gov/gibs) for visualization, and a number of OPeNDAP services hosted by data providers.
## License
## Application Installation and Usage
The Earthdata Search application uses Node.js and Vite to generate static assets. The serverless application utilizes the following AWS services (important to note if deploying to an AWS environment):

- S3
  - We highly recommend using CloudFront in front of S3.
- SQS
- Step Functions
- API Gateway
- Lambda
- CloudWatch (Events)
### Prerequisites
#### Node.js

We recommend using [Node Version Manager](https://github.com/nvm-sh/nvm?tab=readme-ov-file#installing-and-updating) (NVM) to manage your Node.js install. Use the shell integration to [automatically switch Node versions](https://github.com/nvm-sh/nvm?tab=readme-ov-file#calling-nvm-use-automatically-in-a-directory-with-a-nvmrc-file).

NVM will automatically install the correct Node version defined in `.nvmrc`:

    nvm use
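If the pinned version has not been installed yet, a minimal one-time setup looks like this (a sketch, assuming NVM is installed with its shell integration enabled):

```bash
# Run from the repository root; both commands read the version pinned in .nvmrc.
nvm install      # install the pinned Node version if it is missing
nvm use          # activate it for the current shell
node --version   # should match the version in .nvmrc
```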
##### PostgreSQL
Earthdata Search uses PostgreSQL in production on AWS RDS. If you don't already have it installed, [download](https://www.postgresql.org/download/) and install it to your development environment.

If you decide to install via Homebrew you'll need to create the default user:

    createuser -s postgres
##### Docker

In order to simulate S3 locally we use [Minio](https://min.io/docs/minio/container/index.html) within a Docker container.
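
If you want to start the container by hand, a minimal sketch follows. The image name, ports, and credentials come from Minio's container quickstart and are assumptions here (the container name `edsc-minio` is just a placeholder); the project's own npm scripts may manage this for you.

```bash
# Local S3 stand-in via Minio (sketch; adjust ports and credentials to your setup).
docker run -d --name edsc-minio \
  -p 9000:9000 -p 9001:9001 \
  -e MINIO_ROOT_USER=minioadmin \
  -e MINIO_ROOT_PASSWORD=minioadmin \
  minio/minio server /data --console-address ":9001"
```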
##### Docker, Optional

In order to simulate SQS locally we use [ElasticMQ](https://github.com/softwaremill/elasticmq) within a Docker container.
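
As with Minio, a hand-run sketch is below; the image and ports (9324 for the SQS-compatible API, 9325 for the UI) follow the ElasticMQ README, and the container name is a placeholder.

```bash
# Local SQS stand-in via ElasticMQ (sketch).
docker run -d --name edsc-elasticmq -p 9324:9324 -p 9325:9325 softwaremill/elasticmq-native
```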
##### Redis, Optional

To use an image cache you need to have Redis installed.

**Recommended:** Use Homebrew

    brew install redis

Optionally you can run Redis in a Docker container with:

    npm run start:cache

To stop the Redis Docker container:

    npm run stop:cache
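To confirm the cache is reachable (assuming `redis-cli` is installed), a quick check:

```bash
# Should print PONG when Redis is up on the default port.
redis-cli ping
```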
### Initial Setup

##### Package Installation
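Assuming the standard npm workflow, install the project's dependencies from the repository root:

```bash
# Installs the dependencies listed in package.json.
npm install
```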
Ensure that you have a database created.
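
A minimal sketch for creating one locally, assuming the `postgres` superuser from the PostgreSQL step above and a development database named `edsc_dev` (adjust the name to match your configuration):

```bash
createdb -U postgres edsc_dev   # create the development database
psql -U postgres -l             # list databases to confirm it exists
```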
To run the migrations locally:

    npm run invoke-local migrateDatabase
###### Creating a new database migration

To create a new database migration, use this command to ensure the migration follows the same timestamp naming scheme:

    npm run migrate create name-of-migration
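For example, with a hypothetical migration name:

```bash
# Generates a migration file following the timestamp naming scheme described above.
npm run migrate create add-example-table
```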
### Run the Application Locally
The local development environment for the static assets can be started by executing the command below in the project root directory:

    npm start

This will start everything you need to run Earthdata Search locally:

- Mock API Gateway: [http://localhost:3001](http://localhost:3001)
- Watch for code changes to the `serverless` directory
- ElasticMQ container for SQS Queues
- Mock SQS service to trigger lambdas on SQS messages
- Mock S3 service for generating notebooks
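
As a quick, optional smoke test (not one of the project's scripts), you can check that the static app and the mock API Gateway are answering once startup finishes; the ports below are assumptions based on the defaults this README references.

```bash
# Optional check; assumes port 8080 for the static app and 3001 for the mock API Gateway.
curl -sI http://localhost:8080 | head -n 1
curl -sI http://localhost:3001 | head -n 1
```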
#### Optional Services

By default we don't run SQS or an image cache locally. In order to run the application with those services you need to include the following environment variables when you start the application:

    USE_IMAGE_CACHE=true SKIP_SQS=false npm start

Or run:

    npm run start:optionals
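The same variables can also be exported rather than set inline, which is equivalent to the one-liner above:

```bash
# Variable names come from the command above.
export USE_IMAGE_CACHE=true
export SKIP_SQS=false
npm start
```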
### Building the Application

The production build of the application will be output in the `/static/dist/` directory:

    npm run build

This production build can be run locally with any number of http-server solutions. A simple one is to use the `http-server` package:

    npx http-server static/dist
### Invoking lambdas locally

To invoke lambdas locally, we must create a stringified JSON file with the order information for the specific lambda we are trying to run; the structure of the event will differ between lambdas. Typically this will include data from your local database instance.

    npm run invoke-local <name-of-lambda-function> ./path/to/event.json
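For example, a hedged sketch of the workflow; the event fields below are placeholders, since the real shape depends on the lambda's handler in the `serverless` directory:

```bash
# Write a hypothetical event file, then pass its path to the lambda.
cat > ./event.json <<'EOF'
{
  "body": "{\"params\": {}}"
}
EOF
npm run invoke-local <name-of-lambda-function> ./event.json
```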
### Run the Automated [Jest](https://jestjs.io/) tests

Once the project is built, you must ensure that the automated unit tests pass:

    npm run test
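To run a single spec file (hypothetical path shown), arguments after `--` are passed through to Jest:

```bash
npm run test -- path/to/yourComponent.test.js
```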
To run Jest in `watch` mode:

    npm run test:watch

To only get coverage on files tested, run:

    npm run test:watch-lite

Test coverage will be updated in the `coverage` directory. To see the breakdown, use:

    open coverage/lcov-report/index.html
### Deployment

When the time comes to deploy the application, first ensure that you have the required environment variables set:

- AWS_ACCESS_KEY_ID
- AWS_SECRET_ACCESS_KEY
- STAGE_NAME
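
For example, with placeholder values (use credentials for the target AWS account and the stage you intend to deploy):

```bash
export AWS_ACCESS_KEY_ID=AKIA...    # placeholder
export AWS_SECRET_ACCESS_KEY=...    # placeholder
export STAGE_NAME=dev               # placeholder stage name
```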
This application runs in a VPC for NASA security purposes; therefore, additional VPC-related values are expected when a deployment occurs.

For production use, this application uses Scatter Swap to obfuscate some IDs.