
Commit 34dc02f

Add publications for 2024
1 parent 2fb1abd commit 34dc02f

1 file changed: +61 −0 lines changed

_data/publist.yml

Lines changed: 61 additions & 0 deletions
@@ -470,3 +470,64 @@
   link: /publications/fast-and-automatic-floating-point-error-analysis-with-chef-fp
   volume: '608'
   year: '2023'
+
+- title: Performance Portable Gradient Computations Using Source Transformation
+  author: Kim Liegeois, Brian Kelley, Eric Phipps, Sivasankaran Rajamanickam and
+    Vassil Vassilev
+  abstract: |
+    Derivative computation is a key component of optimization, sensitivity
+    analysis, uncertainty quantification, and nonlinear solvers. Automatic
+    differentiation (AD) is a powerful technique for evaluating such
+    derivatives, and in recent years has been integrated into programming
+    environments such as JAX, PyTorch, and TensorFlow to support derivative
+    computations needed for training machine learning models, resulting in
+    widespread use of these technologies. The C++ language has become the de
+    facto standard for scientific computing due to numerous factors, yet
+    language complexity has made the adoption of AD technologies for C++
+    difficult, hampering the incorporation of powerful differentiable
+    programming approaches into C++ scientific simulations. This is exacerbated
+    by the increasing emergence of architectures such as GPUs, which have
+    limited memory capabilities and require massive thread-level
+    concurrency. Portable scientific codes rely on domain-specific programming
+    models such as Kokkos, making AD for such codes even more complex.<br />
+    In this paper, we investigate source transformation-based automatic
+    differentiation using Clad to automatically generate portable and efficient
+    gradient computations of Kokkos-based code. We discuss the modifications of
+    Clad required to differentiate Kokkos abstractions. We illustrate the
+    feasibility of our proposed strategy by comparing the wall-clock time of the
+    generated gradient code with the wall-clock time of the input function on
+    cutting-edge GPU architectures such as the NVIDIA H100, AMD MI250X,
+    and Intel Ponte Vecchio. For these three architectures and for the
+    considered example, evaluating up to 10,000 entries of the gradient
+    took at most 2.17 times the wall-clock time of evaluating the input function.
+  cites: '0'
+  eprint: 8th International Conference on Algorithmic Differentiation
+  url: https://www.autodiff.org/ad24/
+  year: '2024'
+
+- title: Optimization Using Pathwise Algorithmic Derivatives of Electromagnetic
+    Shower Simulations
+  author: Max Aehle, Mihaly Novak, Vassil Vassilev, Nicolas R. Gauger,
+    Lukas Heinrich, Michael Kagan and David Lange
+  abstract: |
+    Among the well-known methods to approximate derivatives of expectancies
+    computed by Monte-Carlo simulations, averages of pathwise derivatives are
+    often the easiest to apply. Computing them via algorithmic
+    differentiation typically does not require major manual analysis and
+    rewriting of the code, even for very complex programs like simulations of
+    particle-detector interactions in high-energy physics. However, the pathwise
+    derivative estimator can be biased if there are discontinuities in the
+    program, which may diminish its value for applications.<br />
+    This work integrates algorithmic differentiation into the electromagnetic
+    shower simulation code HepEmShow, based on G4HepEm, allowing us to study how
+    well pathwise derivatives approximate derivatives of energy depositions in a
+    sampling calorimeter with respect to parameters of the beam and geometry. We
+    found that when multiple scattering is disabled in the simulation, means of
+    pathwise derivatives converge quickly to their expected values, and these
+    are close to the actual derivatives of the energy deposition. Additionally,
+    we demonstrate the applicability of this novel gradient estimator for
+    stochastic gradient-based optimization in a model example.
+  cites: '0'
+  eprint: https://arxiv.org/pdf/2405.07944
+  url: https://arxiv.org/pdf/2405.07944
+  year: '2024'
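The first entry above is about Clad generating gradient code by source transformation. As a minimal sketch of that call pattern only (a toy scalar function rather than the paper's Kokkos kernels, and assuming a Clang build with the Clad plugin loaded):

// Minimal Clad usage sketch: clad::gradient triggers a compile-time
// source transformation that generates the gradient of f.
// The function f is an illustrative stand-in, not from the paper.
#include "clad/Differentiator/Differentiator.h"
#include <cstdio>

double f(double x, double y) { return x * x * y + y; }

int main() {
  auto f_grad = clad::gradient(f);     // derivative code generated by Clad
  double dx = 0.0, dy = 0.0;
  f_grad.execute(3.0, 4.0, &dx, &dy);  // dx = 2*x*y = 24, dy = x*x + 1 = 10
  std::printf("df/dx = %g, df/dy = %g\n", dx, dy);
  return 0;
}

Teaching Clad to differentiate Kokkos abstractions (kernels, views, execution spaces) is precisely the layer the paper adds on top of this basic workflow.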

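The pathwise estimator studied in the second entry has a compact form: when each simulated path depends smoothly on the parameter, the derivative of a Monte-Carlo mean can be estimated by averaging the per-path derivatives. A self-contained toy sketch follows; the integrand exp(theta*z) and its hand-written derivative are illustrative assumptions, whereas the paper obtains per-path derivatives via algorithmic differentiation inside HepEmShow:

// Pathwise derivative estimator for the toy expectation E[exp(theta*Z)],
// Z ~ N(0,1): average d/dtheta exp(theta*z) = z*exp(theta*z) over paths.
#include <cmath>
#include <cstdio>
#include <random>

int main() {
  std::mt19937 rng(42);
  std::normal_distribution<double> normal(0.0, 1.0);
  const double theta = 0.5;
  const int n = 100000;
  double mean_df = 0.0;
  for (int i = 0; i < n; ++i) {
    double z = normal(rng);              // one simulated path
    mean_df += z * std::exp(theta * z);  // derivative along that path
  }
  mean_df /= n;
  // Exact value: d/dtheta exp(theta^2 / 2) = theta * exp(theta^2 / 2).
  std::printf("pathwise estimate: %g, exact: %g\n",
              mean_df, theta * std::exp(0.5 * theta * theta));
  return 0;
}

The bias the abstract warns about arises when the program has parameter-dependent discontinuities, which this smooth toy example deliberately avoids.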