
Conversation

@ax3l
Member

@ax3l ax3l commented Mar 19, 2025

Add 2D Space Charge in Particle Tracking. First part of #401

To Do

For this to compile, you need to set:
cmake --fresh -S . -B build -DImpactX_FFT=ON -DImpactX_ablastr_branch=a76eecf1553f5d0e72ec948c9a67db4111a1544f (WarpX commit of the merged PR BLAST-WarpX/warpx#6201 )

Follow-up

@ax3l ax3l added the component: space charge Space charge & potential solver label Mar 19, 2025
@ax3l ax3l added this to the Advanced Methods (SciDAC-5) milestone Mar 19, 2025
@ax3l ax3l requested a review from WeiqunZhang March 19, 2025 22:38
// push momentum
px += field_interp[0] * push_consts;
py += field_interp[1] * push_consts;
pz += field_interp[2] * push_consts; // TODO: is this always += 0.0?
Member Author


Non-zero, because we will model a variation of the current density along t.

It may often be negligible in practice. We can start without the longitudinal kick and add it later.
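The push in the snippet above, with the longitudinal kick treated as optional per this comment, can be sketched as follows (variable names borrowed from the snippet; this is an illustration, not the actual ImpactX code):

```python
# Minimal sketch of the momentum push from the snippet above, with the
# longitudinal kick made optional as discussed in this comment thread.
def push_momentum(p, field_interp, push_consts, with_longitudinal_kick=True):
    """p = (px, py, pz); field_interp = (Ex, Ey, Ez) gathered at the particle."""
    px, py, pz = p
    px += field_interp[0] * push_consts
    py += field_interp[1] * push_consts
    if with_longitudinal_kick:
        # Non-zero once the current density varies along t; often negligible.
        pz += field_interp[2] * push_consts
    return (px, py, pz)
```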

@ax3l ax3l force-pushed the topic-space-charge-particles-2D branch 2 times, most recently from de299e6 to 7b53d47 on July 15, 2025 18:53
@ax3l ax3l requested a review from cemitch99 July 15, 2025 18:56
@ax3l ax3l force-pushed the topic-space-charge-particles-2D branch from 7b53d47 to 79b5d1c on August 28, 2025 21:19
@ax3l ax3l changed the title [WIP] 2.5D Space Charge in Particle Tracking [WIP] 2D Space Charge in Particle Tracking Aug 28, 2025
@cemitch99 cemitch99 changed the title [WIP] 2D Space Charge in Particle Tracking 2D Space Charge in Particle Tracking Aug 28, 2025
@cemitch99
Member

The test here currently fails with the simple error:

domain size in direction 2 is 1
blocking_factor is 8
amrex::Error::0::domain size not divisible by blocking_factor !!!

We should make this error conditional on the space charge solver, since the 2D solver requires a grid size of a single cell in z.
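One way to satisfy the divisibility check when the 2D solver forces a single cell in z is a per-direction blocking factor with z set to 1; the reproduction input further down this thread does exactly this:

```
amr.n_cell = 16 16 1            # one cell in z, as required by the 2D solve
amr.blocking_factor_x = 8
amr.blocking_factor_y = 8
amr.blocking_factor_z = 1       # must divide the z domain size of 1
```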

Member

@cemitch99 cemitch99 left a comment


Need to touch base on the scale/normalization of the fields returned by the 2D solve, in order to correctly set the value of push_constants in GatherAndPush.cpp.

RemiLehe pushed a commit to BLAST-WarpX/warpx that referenced this pull request Sep 9, 2025
Enable interpolating fields to particles in 1D or 2D even when running a
3D simulation. Needed for 2D and 2.5D space charge in ImpactX:
BLAST-ImpactX/impactx#909
@cemitch99
Member

cemitch99 commented Sep 12, 2025

Here is a more interesting observation. I ran with an axisymmetric KV beam, expanding in free space, with 2D space charge for an identical number of grid points in x and y, but the envelopes show significantly different evolution. This is evidence of x/y asymmetry in the underlying 2D Poisson solve. The push constants need to be updated, but they are identical in x and y.
Expanding_beam_2D_envelopes.pdf

# Particle Beam(s)
beam.npart = 100000  # outside tests, use 1e5 or more
beam.units = static
beam.kin_energy = 250.0
beam.current = 1.0e-3
beam.particle = proton
beam.distribution = kvdist
beam.lambdaX = 4.472135955e-4
beam.lambdaY = beam.lambdaX
beam.lambdaT = 9.12241869e-7
beam.lambdaPx = 0.0
beam.lambdaPy = 0.0 
beam.lambdaPt = 0.0

# Beamline: lattice elements and segments
lattice.elements = monitor drift1 monitor
lattice.nslice = 40

drift1.type = drift
drift1.ds = 6.0

monitor.type = beam_monitor
monitor.backend = h5

# Algorithms
algo.particle_shape = 2
algo.space_charge = 2D
algo.poisson_solver = "fft"

amr.n_cell = 16 16 1
amr.blocking_factor_x = 8
amr.blocking_factor_y = 8
amr.blocking_factor_z = 1

geometry.prob_relative = 1.01
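To quantify the asymmetry reported above, a minimal check of the x/y RMS envelopes computed from particle coordinates can be used: for the axisymmetric KV beam in this input, sigma_x and sigma_y should agree up to sampling noise, so a persistent ratio away from 1 points at an x/y asymmetry in the solve. A sketch with synthetic numpy input (not tied to the actual monitor output format):

```python
import numpy as np

def envelope_asymmetry(x, y):
    """Ratio of RMS beam sizes; ~1 for an axisymmetric beam."""
    sigma_x = np.sqrt(np.mean(x**2) - np.mean(x)**2)
    sigma_y = np.sqrt(np.mean(y**2) - np.mean(y)**2)
    return sigma_x / sigma_y

# Stand-in for monitor output: an axisymmetric particle distribution
# with the lambdaX used in the input above.
rng = np.random.default_rng(42)
x = rng.normal(0.0, 4.472e-4, 100_000)
y = rng.normal(0.0, 4.472e-4, 100_000)
ratio = envelope_asymmetry(x, y)
```

With 1e5 particles the sampling noise on this ratio is well below a percent, so the significant x/y differences in the attached envelope plot cannot be statistical.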

@ax3l ax3l mentioned this pull request Sep 15, 2025
Comment on lines 110 to 111
px += field_interp[0] * push_consts * beta_gamma * dr[2] / (c0_SI);
py += field_interp[1] * push_consts * beta_gamma * dr[2] / (c0_SI); //this should be field_interp[1]
Member


The constants appearing in px and py are currently in draft form only, and will continue to be updated.

ax3l and others added 5 commits September 23, 2025 15:28
Keep simple for now.
Phi is then blown up again to 3D with identical values in z.

Co-authored-by: Weiqun Zhang <[email protected]>
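The commit note above, "Phi is then blown up again to 3D", can be pictured as a broadcast along z; a minimal numpy sketch of the idea (not the AMReX implementation):

```python
import numpy as np

# Solve phi on a 2D (x, y) grid, then replicate it along z so every
# z-slice carries identical values.
phi_2d = np.arange(16.0).reshape(4, 4)  # stand-in for the 2D solve result
nz = 8
phi_3d = np.broadcast_to(phi_2d[:, :, None], (4, 4, nz)).copy()

# Every z-slice equals the 2D potential:
assert all(np.array_equal(phi_3d[:, :, k], phi_2d) for k in range(nz))
```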
@cemitch99
Member

cemitch99 commented Oct 20, 2025

@WeiqunZhang and @ax3l The test mentioned earlier today for this PR is located in: examples/expanding_beam/

C++ input file: input_expanding_fft_2D.in
Python input file: run_expanding_fft_2D.py

I have added a script for plotting the x-y beam size evolution called plot_expanding_fft_2D.py.

If the test works, the beam size at the end (at the maximum value of $s$ shown in the plot) should be 1 mm.

To see the convergence issue, I suggest running the Python test with npart = 1M for a few values of sim.blocking_factor_x and sim.blocking_factor_y. You can see the effect this has on the beam moments by running the plot script in each case.

@cemitch99
Member

@WeiqunZhang and @ax3l Note: I tried the same test using 3D space charge (using examples/expanding_beam/run_expanding_fft.py after turning off mesh refinement), and this issue does not occur.

@cemitch99
Member

cemitch99 commented Oct 30, 2025

@WeiqunZhang and @ax3l Here is the latest update: If I run the app example "input_expanding_fft_2D.in", then I always see that 1 grid is used, independently of the choice of blocking factor ("1 grids 1024 cells"), and the results are consistent:
ExpandingBeam_app_input
However, if I run the corresponding Python example "run_expanding_fft_2D.py", then as I change the blocking factor, the number of grids varies (bf = 32, grids = 1; bf = 16, grids = 4; bf = 8, grids = 16; bf = 4, grids = 64; bf = 2, grids = 256). The results are consistent for bf >= 8, but disagree for smaller bf. Here, the number of cells is [32,32,1] in each case.
ExpandingBeam_py_input
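The reported grid counts follow a simple pattern, assuming each box in the Python run ends up with edge length equal to the blocking factor on the [32, 32, 1] cell domain:

```python
# With n_cell = [32, 32, 1] and boxes of edge length bf in x and y,
# the number of grids is (32 // bf)**2. This reproduces the counts
# observed above for the Python run (bf = 32 -> 1, ..., bf = 2 -> 256).
def num_grids(ncell, bf):
    return (ncell // bf) ** 2

observed = {32: 1, 16: 4, 8: 16, 4: 64, 2: 256}
for bf, grids in observed.items():
    assert num_grids(32, bf) == grids
```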

This issue has now been fixed. Thanks @WeiqunZhang !

@cemitch99
Member

For this PR, we need to resolve the following error resulting in failing tests:

amrex::Abort::1::Particle container needs to have at least one grid. !!!
SIGABRT

Grids Summary:
Level 0 1 grids 1024 cells 100 % of domain

@cemitch99
Member

Fixed.

@ax3l
Member Author

ax3l commented Nov 7, 2025

@WeiqunZhang can you confirm what the last AMReX patch is that we need? Are there any newer than AMReX 25.11?

@ax3l
Member Author

ax3l commented Nov 7, 2025

Slack:

I don't think that PR needs anything beyond 25.11.

@cemitch99
Member

To further validate this PR, here is a figure illustrating the computed beam moments for the example input_fodo_2d_sc.in (modified to use 1 M particles and a [64,64] grid), compared against the result using envelope tracking. This is a high-current case, with strong asymmetry in X and Y.
FODO_2D_SpaceCharge_Example

Comment on lines +151 to +159
amrex::IntVect num_guards_phi{num_guards_rho + 1}; // todo: I think this just depends on max(MLMG, force calc)
amrex::BoxArray phi_ba = cba;
if (space_charge == SpaceChargeAlgo::True_2D) {
num_guards_phi[2] = 0;
amrex::BoxList bl(amrex::IndexType{rho_nodal_flag});
bl.reserve(cba.size());
for (int i = 0, N = cba.size(); i < N; ++i) {
amrex::Box b = cba[i];
b.convert(rho_nodal_flag).setRange(2,0); // Flatten box in z-direction
Member Author


I am not particularly proud of the logic here and in PoissonSolve on how and when we allocate 2D fields. But I am ok to go ahead for now and clean it up in another go.

@ax3l ax3l enabled auto-merge (squash) November 7, 2025 22:59
@ax3l ax3l merged commit 85b9f54 into BLAST-ImpactX:development Nov 8, 2025
16 checks passed

Labels

component: space charge (Space charge & potential solver), tracking: particles
