Merged
Changes from 9 commits
24 changes: 14 additions & 10 deletions README.md
@@ -9,24 +9,28 @@ This package can be used in combination with [ProximalOperators.jl](https://gith

[StructuredOptimization.jl](https://github.com/kul-forbes/StructuredOptimization.jl) provides a higher-level interface to formulate and solve problems using (some of) the algorithms here included.

### Installation
### Quick start

To install the package, simply issue the following command in the Julia REPL:

```julia
julia> Pkg.add("ProximalAlgorithms")
] add ProximalAlgorithms
```

Check out [these test scripts](test/problems) for examples on how to apply
the provided algorithms to problems.

### Implemented Algorithms

Algorithm | Function | Reference
--------------------------------------|---------------|-----------
Douglas-Rachford splitting algorithm | [`douglasrachford`](src/algorithms/douglasrachford.jl) | [[1]][eckstein_1989]
Forward-backward splitting (i.e. proximal gradient) algorithm | [`forwardbackward`](src/algorithms/forwardbackward.jl) | [[2]][tseng_2008], [[3]][beck_2009]
Chambolle-Pock primal dual algorithm | [`chambollepock`](src/algorithms/primaldual.jl) | [[4]][chambolle_2011]
Vũ-Condat primal-dual algorithm | [`vucondat`](src/algorithms/primaldual.jl) | [[6]][vu_2013], [[7]][condat_2013]
Davis-Yin splitting algorithm | [`davisyin`](src/algorithms/davisyin.jl) | [[9]][davis_2017]
Asymmetric forward-backward-adjoint algorithm | [`afba`](src/algorithms/primaldual.jl) | [[10]][latafat_2017]
PANOC (L-BFGS) | [`panoc`](src/algorithms/panoc.jl) | [[11]][stella_2017]
ZeroFPR (L-BFGS) | [`zerofpr`](src/algorithms/zerofpr.jl) | [[12]][themelis_2018]
Douglas-Rachford splitting algorithm | [`DouglasRachford`](src/algorithms/douglasrachford.jl) | [[1]][eckstein_1989]
Forward-backward splitting (i.e. proximal gradient) algorithm | [`ForwardBackward`](src/algorithms/forwardbackward.jl) | [[2]][tseng_2008], [[3]][beck_2009]
Vũ-Condat primal-dual algorithm | [`VuCondat`](src/algorithms/primaldual.jl) | [[4]][chambolle_2011], [[6]][vu_2013], [[7]][condat_2013]
Davis-Yin splitting algorithm | [`DavisYin`](src/algorithms/davisyin.jl) | [[9]][davis_2017]
Asymmetric forward-backward-adjoint algorithm | [`AFBA`](src/algorithms/primaldual.jl) | [[10]][latafat_2017]
PANOC (L-BFGS) | [`PANOC`](src/algorithms/panoc.jl) | [[11]][stella_2017]
ZeroFPR (L-BFGS) | [`ZeroFPR`](src/algorithms/zerofpr.jl) | [[12]][themelis_2018]
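As a usage sketch (assuming ProximalOperators.jl provides the proximal terms; the problem data below is illustrative), a small L1-regularized problem can be solved with the Douglas-Rachford solver:

```julia
using ProximalAlgorithms
using ProximalOperators  # provides NormL1, SqrNormL2, Translate

# minimize 0.5*||x - b||^2 + ||x||_1; the solution is soft-thresholding of b
b = [1.0, -2.0, 3.0]
f = Translate(SqrNormL2(), -b)  # f(x) = 0.5*||x - b||^2
g = NormL1(1.0)                 # g(x) = ||x||_1

solver = ProximalAlgorithms.DouglasRachford(gamma=1.0, tol=1e-8)
y, z, it = solver(zeros(3), f=f, g=g)  # y ≈ z ≈ [0.0, -1.0, 2.0]
```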

### Contributing

114 changes: 70 additions & 44 deletions src/algorithms/davisyin.jl
@@ -1,10 +1,6 @@
################################################################################
# Davis-Yin splitting iterable
#
# See:
# [1] Davis, Yin "A Three-Operator Splitting Scheme and its Optimization Applications",
# Set-Valued and Variational Analysis, vol. 25, no. 4, pp 829–858 (2017).
#
# Davis, Yin. "A Three-Operator Splitting Scheme and its Optimization
# Applications", Set-Valued and Variational Analysis, vol. 25, no. 4,
# pp. 829–858 (2017).

using Base.Iterators
using ProximalAlgorithms.IterationTools
@@ -64,60 +60,90 @@ function Base.iterate(iter::DYS_iterable, state::DYS_state)
return state, state
end

# Solver

struct DavisYin{R}
gamma::Maybe{R}
lambda::R
maxit::Int
tol::R
verbose::Bool
freq::Int

function DavisYin{R}(; gamma::Maybe{R}=nothing, lambda::R=R(1.0),
maxit::Int=10000, tol::R=R(1e-8), verbose::Bool=false, freq::Int=100
) where R
@assert gamma === nothing || gamma > 0
@assert lambda > 0
@assert maxit > 0
@assert tol > 0
@assert freq > 0
new(gamma, lambda, maxit, tol, verbose, freq)
end
end

function (solver::DavisYin{R})(x0::AbstractArray{C};
f=Zero(), g=Zero(), h=Zero(), A=I, L::Maybe{R}=nothing
) where {R, C <: Union{R, Complex{R}}}

stop(state::DYS_state) = norm(state.res, Inf) <= solver.tol
disp((it, state)) = @printf("%5d | %.3e\n", it, norm(state.res, Inf))

if solver.gamma === nothing
if L !== nothing
gamma = R(1)/L
else
error("You must specify either L or gamma")
end
else
gamma = solver.gamma
end

iter = DYS_iterable(f, g, h, A, x0, gamma, solver.lambda)
iter = take(halt(iter, stop), solver.maxit)
iter = enumerate(iter)
if solver.verbose iter = tee(sample(iter, solver.freq), disp) end

num_iters, state_final = loop(iter)

return state_final.xf, state_final.xg, num_iters

end

# Outer constructors

"""
davisyin(x0; f, g, h, A, [...])
DavisYin([gamma, lambda, maxit, tol, verbose, freq])

Solves convex optimization problems of the form
Instantiate the Davis-Yin splitting algorithm (see [1]) for solving
convex optimization problems of the form

minimize f(x) + g(x) + h(A x),

where `h` is smooth and `A` is a linear mapping, using the Davis-Yin splitting
algorithm, see [1].

Either of the following arguments must be specified:

* `L::Real`, Lipschitz constant of the gradient of `h(A x)`.
* `gamma::Real`, stepsize parameter.

Other optional keyword arguments:
where `h` is smooth and `A` is a linear mapping (for example, a matrix).
If `solver = DavisYin(args...)`, then the above problem is solved with

    solver(x0; [f, g, h, A])

Optional keyword arguments:

* `gamma::Real` (default: `nothing`), stepsize parameter.
* `lambda::Real` (default: `1.0`), relaxation parameter, see [1].
* `maxit::Integer` (default: `10_000`), maximum number of iterations to perform.
* `tol::Real` (default: `1e-8`), absolute tolerance on the fixed-point residual.
* `verbose::Bool` (default: `false`), whether or not to print information during the iterations.
* `freq::Integer` (default: `100`), frequency of verbosity.

If `gamma` is not specified at construction time, the following keyword
argument must be specified at solve time:

* `L::Real`, Lipschitz constant of the gradient of `h(A x)`.

References:

[1] Davis, Yin. "A Three-Operator Splitting Scheme and its Optimization
Applications", Set-Valued and Variational Analysis, vol. 25, no. 4,
pp. 829–858 (2017).
"""
function davisyin(x0;
f=Zero(), g=Zero(), h=Zero(), A=I,
lambda=1.0, L=nothing, gamma=nothing,
maxit=10_000, tol=1e-8,
verbose=false, freq=100)

R = real(eltype(x0))

stop(state::DYS_state) = norm(state.res, Inf) <= R(tol)
disp((it, state)) = @printf("%5d | %.3e\n", it, norm(state.res, Inf))

if gamma === nothing
if L !== nothing
gamma = R(1)/R(L)
else
error("You must specify either L or gamma")
end
end

iter = DYS_iterable(f, g, h, A, x0, R(gamma), R(lambda))
iter = take(halt(iter, stop), maxit)
iter = enumerate(iter)
if verbose iter = tee(sample(iter, freq), disp) end

num_iters, state_final = loop(iter)

return state_final.xf, state_final.xg, num_iters
end
DavisYin(::Type{R}; kwargs...) where R = DavisYin{R}(; kwargs...)
DavisYin(; kwargs...) = DavisYin(Float64; kwargs...)
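As a usage sketch of the new solver-object interface (assuming ProximalOperators.jl for the proximal terms; the problem data below is illustrative), a nonnegative L1-regularized least-squares problem maps onto the three-operator template as:

```julia
using LinearAlgebra
using ProximalAlgorithms
using ProximalOperators  # NormL1, IndNonnegative, SqrNormL2, Translate

# minimize  0.1*||x||_1 + δ_{x ≥ 0}(x) + 0.5*||A*x - b||^2
A = [1.0 0.5; 0.5 2.0; 0.0 1.0]
b = [1.0, 2.0, 1.5]
f = NormL1(0.1)
g = IndNonnegative()
h = Translate(SqrNormL2(), -b)  # h(y) = 0.5*||y - b||^2

solver = ProximalAlgorithms.DavisYin(tol=1e-6)
# L = Lipschitz constant of the gradient of h(A x)
xf, xg, it = solver(zeros(2), f=f, g=g, h=h, A=A, L=opnorm(A)^2)
# xf and xg both approximate the solution
```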
87 changes: 55 additions & 32 deletions src/algorithms/douglasrachford.jl
@@ -1,10 +1,6 @@
################################################################################
# Douglas-Rachford splitting iterable
#
# [1] Eckstein, Bertsekas "On the Douglas-Rachford Splitting Method and the
# Proximal Point Algorithm for Maximal Monotone Operators*",
# Eckstein, Bertsekas, "On the Douglas-Rachford Splitting Method and the
# Proximal Point Algorithm for Maximal Monotone Operators",
# Mathematical Programming, vol. 55, no. 1, pp. 293-318 (1989).
#

using Base.Iterators
using ProximalAlgorithms: LBFGS
@@ -39,16 +35,61 @@ function Base.iterate(iter::DRS_iterable, state::DRS_state=DRS_state(iter))
return state, state
end

# Solver

struct DouglasRachford{R}
gamma::R
maxit::Int
tol::R
verbose::Bool
freq::Int

function DouglasRachford{R}(; gamma::R, maxit::Int=1000, tol::R=R(1e-8),
verbose::Bool=false, freq::Int=100
) where R
@assert gamma > 0
@assert maxit > 0
@assert tol > 0
@assert freq > 0
new(gamma, maxit, tol, verbose, freq)
end
end

function (solver::DouglasRachford{R})(
x0::AbstractArray{C}; f=Zero(), g=Zero()
) where {R, C <: Union{R, Complex{R}}}

stop(state::DRS_state) = norm(state.res, Inf) <= solver.tol
disp((it, state)) = @printf("%5d | %.3e\n", it, norm(state.res, Inf))

iter = DRS_iterable(f, g, x0, solver.gamma)
iter = take(halt(iter, stop), solver.maxit)
iter = enumerate(iter)
if solver.verbose iter = tee(sample(iter, solver.freq), disp) end

num_iters, state_final = loop(iter)

return state_final.y, state_final.z, num_iters

end

# Outer constructors

"""
douglasrachford(x0; f, g, gamma, [...])
DouglasRachford([gamma, maxit, tol, verbose, freq])

Instantiate the Douglas-Rachford splitting algorithm (see [1]) for solving
convex optimization problems of the form

minimize f(x) + g(x),

Minimizes `f(x) + g(x)` with respect to `x`, using the Douglas-Rachford splitting
algorithm starting from `x0`, with stepsize `gamma`.
If unspecified, `f` and `g` default to the identically zero function,
while `gamma` defaults to one.
If `solver = DouglasRachford(args...)`, then the above problem is solved with

    solver(x0; [f, g])

Other optional keyword arguments:
Optional keyword arguments:

* `gamma::Real`, stepsize parameter (required).
* `maxit::Integer` (default: `1000`), maximum number of iterations to perform.
* `tol::Real` (default: `1e-8`), absolute tolerance on the fixed-point residual.
* `verbose::Bool` (default: `false`), whether or not to print information during the iterations.
@@ -60,23 +101,5 @@ References:
Proximal Point Algorithm for Maximal Monotone Operators",
Mathematical Programming, vol. 55, no. 1, pp. 293-318 (1989).
"""
function douglasrachford(x0;
f=Zero(), g=Zero(),
gamma=1.0,
maxit=1000, tol=1e-8,
verbose=false, freq=100)

R = real(eltype(x0))

stop(state::DRS_state) = norm(state.res, Inf) <= R(tol)
disp((it, state)) = @printf("%5d | %.3e\n", it, norm(state.res, Inf))

iter = DRS_iterable(f, g, x0, R(gamma))
iter = take(halt(iter, stop), maxit)
iter = enumerate(iter)
if verbose iter = tee(sample(iter, freq), disp) end

num_iters, state_final = loop(iter)

return state_final.y, state_final.z, num_iters
end
DouglasRachford(::Type{R}; kwargs...) where R = DouglasRachford{R}(; kwargs...)
DouglasRachford(; kwargs...) = DouglasRachford(Float64; kwargs...)
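To illustrate the outer constructors above (a sketch assuming ProximalOperators.jl; the problem data is illustrative), projecting a point onto a box by minimizing a quadratic over it:

```julia
using ProximalAlgorithms
using ProximalOperators  # SqrNormL2, Translate, IndBox

# minimize 0.5*||x - b||^2  subject to  0 ≤ x ≤ 1
# (the solution is clamp.(b, 0, 1))
b = [2.0, -1.0, 0.5]
f = Translate(SqrNormL2(), -b)  # f(x) = 0.5*||x - b||^2
g = IndBox(0.0, 1.0)            # indicator of the box [0, 1]^3

solver = ProximalAlgorithms.DouglasRachford(gamma=1.0)
y, z, it = solver(zeros(3), f=f, g=g)  # z ≈ [1.0, 0.0, 0.5]
```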