
Commit

Merge pull request #72 from acfr/bugfix/docs-images
Fixed bug and updated project version
EliottEccidio authored May 26, 2023
2 parents d933739 + 529ef4f commit e2f5d1e
Showing 4 changed files with 2 additions and 6 deletions.
2 changes: 1 addition & 1 deletion Project.toml
@@ -1,7 +1,7 @@
name = "RobustNeuralNetworks"
uuid = "a1f18e6b-8af1-433f-a85d-2e1ee636a2b8"
authors = ["Nicholas H. Barbara", "Max Revay", "Ruigang Wang", "Jing Cheng", "Jerome Justin", "Ian R. Manchester"]
version = "0.1.0"
version = "0.2.0"

[deps]
Flux = "587475ba-b771-5e3f-ad9e-33799f191a9c"
Binary file added docs/src/assets/ren.png
4 changes: 0 additions & 4 deletions docs/src/assets/ren.svg

This file was deleted.

2 changes: 1 addition & 1 deletion docs/src/introduction/package_overview.md
@@ -38,7 +38,7 @@ w_t=\sigma(&v_t):=\begin{bmatrix}
where ``v_t, w_t \in \mathbb{R}^{n_v}`` are the inputs and outputs of neurons and ``\sigma`` is the activation function. Graphically, this is equivalent to the following, where the linear (actually affine) system ``G`` represents the first equation above.

```@example
-@html_str """<p align="center"> <object type="image/svg+xml" data=$(joinpath(Main.buildpath, "../assets/ren.svg")) width="35%"></object> </p>""" #hide
+@html_str """<p align="center"> <object type="image/png" data=$(joinpath(Main.buildpath, "../assets/ren.png")) width="35%"></object> </p>""" #hide
```

A *Lipschitz-Bounded Deep Network* (LBDN) can be thought of as a specialisation of a REN with a state dimension of ``n_x = 0``. That is, LBDN models have no dynamics or memory associated with them. In reality, we use this simplification to construct LBDN models completely differently to RENs. We construct LBDNs as ``L``-layer feed-forward networks, much like [MLPs](https://en.wikipedia.org/wiki/Multilayer_perceptron) or [CNNs](https://en.wikipedia.org/wiki/Convolutional_neural_network), described by the following recursive equations.
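The LBDN paragraph in the hunk above refers to recursive feed-forward equations that this diff does not show. As a rough, self-contained illustration of what that recursion looks like (this is not the package's own API; the layer sizes, activation, and variable names below are made up, and the weights are left unconstrained for simplicity):

```julia
# Illustrative sketch only: a generic L-layer feed-forward recursion
# z_{k+1} = σ.(W_k * z_k + b_k), of the kind the overview paragraph describes.
# This is NOT the package's LBDN constructor; the weights here are free
# parameters, whereas an LBDN parameterises them to enforce a Lipschitz bound.
using Random

σ = tanh                            # activation function (assumed)
widths = [2, 16, 16, 1]             # input, hidden..., output sizes (hypothetical)

rng = Xoshiro(0)
W = [randn(rng, widths[k+1], widths[k]) for k in 1:length(widths)-1]
b = [zeros(widths[k+1]) for k in 1:length(widths)-1]

# Apply each layer in turn; no activation on the final (output) layer.
function feedforward(u)
    z = u
    for k in 1:length(W)-1
        z = σ.(W[k] * z .+ b[k])
    end
    return W[end] * z .+ b[end]
end

y = feedforward(randn(rng, 2))
```

In RobustNeuralNetworks.jl itself the weights are not free parameters as in this sketch; they are produced from a direct parameterisation that guarantees the prescribed Lipschitz bound, which is the point of the LBDN construction.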

0 comments on commit e2f5d1e
