Layer the DifferentiationInterface AD extension onto the simplified evaluator
interface introduced on the `evaluators` branch. This is the catch-all path
for any `<:AbstractADType` backend without a native AbstractPPL extension.
- ext/AbstractPPLDifferentiationInterfaceExt.jl: DI-backed `prepare`,
`value_and_gradient!!`, and `value_and_jacobian!!` for vector-input
evaluators. The evaluator is passed as a `DI.Constant` so DynamicPPL-style
problem state stays constant across calls. Compiled `AutoReverseDiff{true}`
takes the one-argument tape path (the `DI.Constant` route would invalidate
the compiled tape across calls). Empty inputs short-circuit before
`DI.prepare_*` since many backends fail on length-zero arrays.
- Project.toml: DifferentiationInterface added as a weakdep with extension
trigger and compat bound.
- test/autograd_tests.jl: shared problem definitions and `run_autograd_tests`
entry point with separate gradient / jacobian / empty-input helpers.
- test/ext/differentiationinterface/: isolated test env using a local
`DummyADType <: AbstractADType` to exercise the catch-all dispatch without
pulling in a real AD package.
- .github/workflows/CI.yml: extend the ext matrix with
`ext/differentiationinterface` alongside the existing
`ext/logdensityproblems`.
- test/run_extras.jl: register the new label.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
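The empty-input short-circuit described above can be sketched in isolation. This is a minimal illustration, not the extension source: `PreparedSketch` and `backend_prepare` are placeholders standing in for the real `Prepared` wrapper and the `DI.prepare_*` calls.

```julia
# Minimal sketch: skip backend preparation entirely for length-zero
# inputs, since many AD backends fail on empty arrays.
# `PreparedSketch` and `backend_prepare` are placeholder names.
struct PreparedSketch{C,E}
    cache::C       # `nothing` marks the empty-input path
    evaluator::E
end

function prepare_sketch(backend_prepare, evaluator, x::AbstractVector)
    isempty(x) && return PreparedSketch(nothing, evaluator)
    return PreparedSketch(backend_prepare(evaluator, x), evaluator)
end
```

The later call sites then dispatch on whether the stored cache is the empty-input marker or a real backend cache.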
The upstream `Simplify Evaluators module after review` commit dropped `_assert_supported_output` / `_assert_jacobian_output` from `Evaluators.jl`, expecting the AD-extension PRs that actually call them to bring them back. Move both helpers into the DI extension, the sole caller after rebase. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Replace `_assert_supported_output` / `_assert_jacobian_output` with an `if y isa Number / elseif y isa AbstractVector / else throw` cascade. The jacobian assertion was unreachable after the scalar branch, and the supported-output helper had a single call site; inlining the remaining check as the `else` arm is clearer than two named helpers. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
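The cascade reads roughly as follows. This is an illustrative sketch (the function name and error text are placeholders); the later `Evaluators._ad_output_arity` commit describes the same shape.

```julia
# Sketch of the output-arity cascade: scalar and vector outputs pass
# through; anything else (matrix, tuple, ...) is rejected up front.
function output_arity_sketch(y)
    if y isa Number
        return :scalar
    elseif y isa AbstractVector
        return :vector
    else
        throw(ArgumentError("unsupported AD output type $(typeof(y))"))
    end
end
```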
Expose shared AD backend fixtures through TestResources so extension test environments can reuse the same cases without including files from test/. Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
Exercise the DifferentiationInterface catch-all with a real backend instead of a stub so the extension test covers the integration path used by downstream AD packages. Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
… extension
- Replace `prepare(adtype, problem, x)`'s two-step `prepare` + `VectorEvaluator`
  wrap with a single delegated call to `AbstractPPL.prepare(problem, x; check_dims)`,
  eliminating a latent double-wrap when `check_dims=false`.
- Move `TestResources` (callable problems, test cases, runner) out of `src/` and
  into a new `AbstractPPLTestExt` package extension, so `Test` is loaded only
  when downstream code opts in.
- Split the unified `TestCase` into `ValueCase` / `ErrorCase`, drop the unused
  `:namedtuple` fixtures, and store the error operation as a callable instead
  of a `:call`/`:gradient` symbol.
- Flatten the DI extension test file to two `run_testcases(Val(...); ...)` calls.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
- value_and_gradient!! / value_and_jacobian!! now check arity before the
length-zero short-circuit, so empty-input vector-valued / scalar-valued
functions surface the correct ArgumentError instead of silently returning
the wrong shape. Uses Val(0) as the cache sentinel for empty inputs.
- New AbstractPPLDifferentiationInterfaceLogDensityProblemsExt advertises
LogDensityOrder{1} for any DI-prepared evaluator, registering the method
in __init__ to side-step extension precompile-visibility constraints.
- Empty-input arity errors and the new capability advertisement are
exercised in the DI test environment, which now also loads LDP.
- Project.toml [compat] alphabetised.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
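The ordering matters and can be sketched as follows. All names are illustrative placeholders; the real method takes a prepared evaluator, not a bare `arity` symbol.

```julia
# Sketch: check output arity first, then short-circuit on empty input.
# Reversing the two checks would let an empty-input call on a
# vector-valued function silently return a wrongly shaped result.
function value_and_gradient_order_sketch(arity::Symbol, f, x::AbstractVector)
    arity === :scalar ||
        throw(ArgumentError("value_and_gradient!! requires a scalar-valued function"))
    isempty(x) && return (f(x), zero(x))   # empty-input fast path
    error("the non-empty path would call the AD backend here")
end
```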
Drop the triple extension and have AbstractPPLLogDensityProblemsExt
advertise LogDensityOrder{1} for any Prepared{<:Any,<:VectorEvaluator}.
The contract is now uniform: anything from `prepare(adtype, …)` claims
gradient capability; a bare VectorEvaluator from `prepare(problem, x)`
stays at order 0. Backends that don't implement value_and_gradient!! or
that wrap a vector-output function surface a runtime error at call time
rather than via a capability downgrade.
Documents the rule in docs/src/evaluators.md.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
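The capability contract can be sketched with placeholder types. The real extension dispatches `LogDensityProblems.capabilities` on the package's own `Prepared` and `VectorEvaluator`; every name below is a stand-in.

```julia
# Sketch of the contract: anything that went through `prepare(adtype, ...)`
# (here: wrapped in PreparedDemo) advertises order 1; a bare evaluator
# stays at order 0. Placeholder types only, not the package's API.
struct LogDensityOrderSketch{K} end
struct VectorEvaluatorSketch end
struct PreparedDemo{C,E}
    cache::C
    evaluator::E
end

capabilities_sketch(::VectorEvaluatorSketch) = LogDensityOrderSketch{0}()
capabilities_sketch(::PreparedDemo{<:Any,VectorEvaluatorSketch}) =
    LogDensityOrderSketch{1}()
```

Mismatches (a backend without `value_and_gradient!!`, or a vector-output function) then fail at call time rather than via a capability downgrade, as the commit message states.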
Layer the native Mooncake AD extension onto the simplified evaluator
interface, covering both `AutoMooncake` (reverse) and `AutoMooncakeForward`
(forward) for vector and NamedTuple inputs.
- ext/AbstractPPLMooncakeExt.jl: native Mooncake path. Reverse mode uses
`prepare_gradient_cache` / `prepare_pullback_cache`; forward mode uses
`prepare_derivative_cache` for both gradient and jacobian. The Mooncake
cache is stored directly on `Prepared`; the empty-input case is tagged
with `nothing` and dispatches to a length-zero short-circuit so we don't
try to build a tape on an empty array.
- Project.toml: Mooncake added as a weakdep with extension trigger and
compat bound `0.5.27`.
- test/autograd_tests.jl: add `run_shared_namedtuple_tests` and a
`namedtuple` opt-in on `run_autograd_tests` for backends with a native
NamedTuple-input path. Relax the cross-call assertion in
`run_shared_jacobian_tests` to `@test_throws Exception` since DI raises
`ArgumentError("scalar-valued ...")` while Mooncake raises its own
`ValueAndGradientReturnTypeError`.
- test/ext/mooncake/: isolated test env exercising both `AutoMooncake` and
`AutoMooncakeForward` through the shared autograd helpers, including
the NamedTuple gradient path.
- .github/workflows/CI.yml: extend the ext matrix with `ext/mooncake`.
- test/run_extras.jl: register the new label.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Use labeled testcase construction and local runners so extension tests stay explicit while avoiding repeated assertion details. Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
Mooncake's empty-input `prepare` previously stored a `nothing` cache and the
empty-input `value_and_{gradient,jacobian}!!` methods returned without
checking output arity, so calling `value_and_gradient!!` on a vector-valued
function (or `value_and_jacobian!!` on a scalar-valued one) silently
succeeded — unlike the DI extension, which raises `"requires a
scalar-valued function"` / `"requires a vector-valued function"`.
Detect output arity at prepare time and tag the empty-input cache with
`Val(:scalar)` / `Val(:vector)`. Add dispatch methods on those markers so
the empty-input call sites raise the same errors DI raises.
Lift the duplicated `y isa Union{Number,AbstractVector} || throw(...)`
check from both extensions into `Evaluators._ad_output_arity`, which
returns the `:scalar`/`:vector` symbol both extensions then dispatch on.
Restore the NamedTuple test group as a separate `Val(:namedtuple)` in
`AbstractPPLTestExt` (one `ValueCase` plus one `ErrorCase`) and wire it
into the Mooncake test runner.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Two fixes to the Mooncake extension's `prepare`/dispatch path:
1. Arity dispatch covers the non-empty path too. Previously, calling
`value_and_gradient!!` on a vector-valued (Mooncake-prepared) function —
or `value_and_jacobian!!` on a scalar-valued one — fell through to
`Mooncake.value_and_gradient!!(jacobian_cache, ...)` and crashed inside
Mooncake with an opaque error. Wrap every cache (empty or not) in a
`MooncakeCache{A,C}` tagged with the detected output arity so both empty
and non-empty mismatches raise the same `"requires a scalar-valued
function"` / `"requires a vector-valued function"` errors DI raises.
2. Thread `check_dims` into the inner prepare. The previous code called
`AbstractPPL.prepare(problem, x)` (defaulting to `check_dims=true`) and
then wrapped the result in another `VectorEvaluator{check_dims}`. The
double-wrap meant the inner evaluator always validated, so
`prepare(adtype, problem, x; check_dims=false)` still threw shape errors
contrary to the documented contract. Mirror DI's pattern by calling
`AbstractPPL.prepare(problem, x; check_dims)` once and using the result
directly.
Tighten the existing `:edge` "gradient of vector-valued output" test from
`Exception` to `r"scalar-valued"` and add a parallel non-empty
"jacobian of scalar output" case so the new arity dispatch is covered for
both DI and Mooncake.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
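The tagged-cache dispatch from fix 1 can be sketched as follows. Names mirror the commit message, but the code is a stand-alone illustration with elided backend calls, not the extension source.

```julia
# Sketch: every prepared cache is wrapped with its detected output
# arity as a type parameter, so mismatched calls fail with a clear
# ArgumentError instead of crashing inside the backend.
struct MooncakeCacheSketch{A,C}
    cache::C
end
MooncakeCacheSketch{A}(cache::C) where {A,C} = MooncakeCacheSketch{A,C}(cache)

grad_sketch(c::MooncakeCacheSketch{:scalar}, x) = (:value, zero(x))  # backend call elided
grad_sketch(::MooncakeCacheSketch{:vector}, x) =
    throw(ArgumentError("value_and_gradient!! requires a scalar-valued function"))

jac_sketch(c::MooncakeCacheSketch{:vector}, x) = (:value, zeros(length(x), length(x)))
jac_sketch(::MooncakeCacheSketch{:scalar}, x) =
    throw(ArgumentError("value_and_jacobian!! requires a vector-valued function"))
```

Because the arity tag lives in the type, the mismatch branch is selected by dispatch for empty and non-empty inputs alike.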
…tras
Aqua's persistent_tasks check spawns a wrapper subprocess that runs
Pkg.precompile() on a package depending on AbstractPPL. On Julia 1.10 this
hits "Declaring __precompile__(false) is not allowed in files that are being
precompiled" inside the wrapper's extension precompile path. The dedicated
Ext CI jobs load and exercise every extension on min Julia and pass, so this
is a Julia 1.10 / Aqua interaction, not a defect in our extensions.
Re-enable when min is bumped past 1.10.
The CI extras step also had a redundant --project=. that was overridden by
run_extras.jl's own Pkg.activate; drop it.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
- Mark generate_testcases / run_testcases as public (1.11+) so downstream
  AD-backend packages can reuse the conformance suite without reaching into
  private API.
- Note in the Prepared docstring that the two-arg constructor is for backends
  that allocate fresh storage per call.
- Expand the _ad_output_arity error message to call out matrix/tuple outputs
  explicitly and recommend flattening.
- Add a MethodError hint on value_and_gradient!! / value_and_jacobian!! that
  fires when no AD backend extension is loaded; it also surfaces through
  LogDensityProblems.logdensity_and_gradient, which delegates to
  value_and_gradient!!. The hint is suppressed once any backend registers a
  method, at which point the standard candidate list is more informative.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
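A MethodError hint of the kind described can be sketched with `Base.Experimental.register_error_hint`. The stub function name and hint wording below are illustrative, not the package's actual text.

```julia
# Sketch: register a hint that fires when the stub function has no
# methods at all, i.e. when no AD backend extension has been loaded.
function value_and_gradient_stub! end

Base.Experimental.register_error_hint(MethodError) do io, exc, argtypes, kwargs
    if exc.f === value_and_gradient_stub! && isempty(methods(value_and_gradient_stub!))
        print(io, "\nNo AD backend extension is loaded; ",
              "load an AD backend package first.")
    end
end
```

Once any backend defines a method, `isempty(methods(...))` is false and the hint stays quiet, matching the suppression behaviour described above.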
Codecov Report

@@            Coverage Diff             @@
##             main     #160      +/-   ##
===========================================
- Coverage   85.24%   70.82%   -14.43%
===========================================
  Files          13       16       +3
  Lines         705      850     +145
===========================================
+ Hits          601      602       +1
- Misses        104      248     +144