- creates a node-red-net network and attaches the container to this network
- persists the `/data` dir inside the container to the `node-red-data` volume in Docker
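
The two points above can be reproduced with plain `docker` CLI commands. This is a sketch: the network, volume, and container names come from the bullets, while the port mapping and `--name` are assumptions:

```sh
# create the user-defined bridge network and the named volume
docker network create node-red-net
docker volume create node-red-data

# run Node-RED attached to that network, persisting /data to the volume
# (port 1880 and the container name "mynodered" are assumptions)
docker run -it -p 1880:1880 \
  --network node-red-net \
  -v node-red-data:/data \
  --name mynodered \
  nodered/node-red
```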
### Dockerfile which copies in local resources
It can sometimes be useful to populate a Node-RED Docker image with files from a local directory (for example, if you want a whole project to be kept in a git repo). To do this, you'll want your local directory to look like this:
```
Dockerfile
README.md
flows.json       # the normal place Node-RED stores your flows
flows_cred.json  # credentials your flows may need
package.json     # the non-standard modules your flows use
settings.js      # the normal settings file
```
The following Dockerfile builds on the base Node-RED Docker image, but additionally copies your own files into place in that image:
```
FROM nodered/node-red

# Copy package.json to the WORKDIR so npm builds all
# of your added modules for Node-RED
COPY package.json .
RUN npm install --only=production

# Copy _your_ Node-RED project files into place
COPY settings.js /data/settings.js
COPY flows_cred.json /data/flows_cred.json
COPY flows.json /data/flows.json

# Start the container normally
CMD ["npm", "start"]
```
#### Dockerfile order and build speed
While not necessary, it's a good idea to do the `COPY package... npm install...` steps early. Your `flows.json` changes frequently as you work in Node-RED, but your `package.json` only changes when you change which modules are part of your project. Since the `npm install` that must re-run whenever `package.json` changes can be time consuming, putting the slow, rarely-changing steps earlier in a Dockerfile lets Docker reuse those cached layers, making subsequent builds much faster.
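
The caching effect can be seen by contrasting two orderings of the same steps (a sketch, not a complete Dockerfile):

```
# Slower: flows.json is copied before npm install, so every flow edit
# invalidates the cache and forces npm install to run again
COPY flows.json /data/flows.json
COPY package.json .
RUN npm install --only=production

# Faster: package.json comes first, so npm install re-runs only when
# your dependencies actually change
COPY package.json .
RUN npm install --only=production
COPY flows.json /data/flows.json
```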
#### Credentials, secrets, and environment variables
Of course you never want to hard-code credentials anywhere, so if you need to use credentials with your Node-RED project, the above Dockerfile will let you have this in your `settings.js`...
```
module.exports = {
    credentialSecret: process.env.NODE_RED_CREDENTIAL_SECRET // add exactly this
}
```
...and then when you run in Docker, you add an environment variable to your `run` command...
`docker run -e "NODE_RED_CREDENTIAL_SECRET=your_secret_goes_here"`
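
If you'd rather fail fast when the secret is missing than let Node-RED fall back to a generated key, a small helper in your `settings.js` could check for it. This is a hypothetical sketch, not part of Node-RED's API; `getCredentialSecret` is a name invented here:

```javascript
// Hypothetical helper: read the credential secret from the environment
// and throw early if it was not provided to `docker run -e ...`
function getCredentialSecret(env = process.env) {
  const secret = env.NODE_RED_CREDENTIAL_SECRET;
  if (!secret) {
    throw new Error("NODE_RED_CREDENTIAL_SECRET must be set");
  }
  return secret;
}
```

You would then use `credentialSecret: getCredentialSecret()` in the exported settings object.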
#### Building and running
You _build_ this Dockerfile normally:
```sh
docker build -t your-image-name:your-tag .
```
To _run_ locally for development, where changes are written immediately back to the local directory you're working from, `cd` into the project's directory and then run:
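
A sketch of such a command, bind-mounting the current directory as `/data` so edits in the Node-RED editor land straight in your checkout (the image tag matches the build example above; the port mapping and `--rm` are assumptions):

```sh
docker run -it --rm -p 1880:1880 \
  -e "NODE_RED_CREDENTIAL_SECRET=your_secret_goes_here" \
  -v "$PWD":/data \
  your-image-name:your-tag
```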