Cosmology with Large-Scale Structure using Machine Learning (CLML)
- Until now
- Data
- Pinocchio
- Using Lagrangian Perturbation Theory (LPT)
- Produces dark matter halo catalogues
- Model
- arXiv:1908.10590
- 3 convolutional layers followed by 3 fully connected layers (see the sketch after this section)
- Result
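A minimal sketch of this kind of baseline, assuming PyTorch, a 64³ density grid as input, and (Ωm, σ8) as output; the channel counts, kernel sizes, and grid size are illustrative assumptions, not the exact setup of arXiv:1908.10590.

```python
import torch
import torch.nn as nn

class BaselineCNN(nn.Module):
    """3 conv layers + 3 fully connected layers mapping a 3D density grid to (Omega_m, sigma_8)."""
    def __init__(self, grid_size=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
        )
        n_flat = 64 * (grid_size // 8) ** 3   # three poolings halve each dimension
        self.fc = nn.Sequential(
            nn.Linear(n_flat, 256), nn.ReLU(),
            nn.Linear(256, 64), nn.ReLU(),
            nn.Linear(64, 2),                 # -> (Omega_m, sigma_8)
        )

    def forward(self, x):                     # x: (batch, 1, N, N, N) density field
        h = self.conv(x)
        return self.fc(h.flatten(start_dim=1))

# Example: one forward pass on a random 64^3 grid
model = BaselineCNN(grid_size=64)
params = model(torch.randn(4, 1, 64, 64, 64))  # shape (4, 2)
```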
- What the author does
- Data
- Which simulation will we use
- Each simulation uses a different algorithm and formalism, which may affect the predicted cosmological parameters.
- Machine learning may capture different aspects of each simulation. We have to check this and choose which simulation to use.
- Pinocchio: Evolving with LPT (Lagrangian Perturbation Theory)
- COLA
- Small-scale: Evolving with N-body
- Large-scale: Evolving with LPT (Lagrangian Perturbation Theory)
- NECOLA: N-body-like
- What is NECOLA?
- NECOLA (Neural Enhanced COLA) [arXiv:2111.02441v1] (Nov 2021)
- NECOLA is trained on the Quijote simulations, which are full N-body simulations.
- We can correct the COLA particle positions to match the results of the full N-body Quijote simulations (a toy sketch of this idea follows this list).
- It is hard to see large differences between them, but the middle row looks more diffuse and its halos are less concentrated in their centers.
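A toy sketch of the NECOLA idea under stated assumptions: a small PyTorch CNN takes the COLA displacement field on a grid and predicts a residual correction, so the corrected field better matches the full N-body (Quijote) target. The architecture, field representation, and loss are illustrative assumptions, not the network of arXiv:2111.02441.

```python
import torch
import torch.nn as nn

class DisplacementCorrector(nn.Module):
    """Toy residual CNN: COLA displacement field (3 components on a grid) -> corrected field."""
    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(3, hidden, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(hidden, hidden, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(hidden, 3, kernel_size=3, padding=1),
        )

    def forward(self, cola_disp):                  # (batch, 3, N, N, N)
        return cola_disp + self.net(cola_disp)     # COLA field plus learned correction

# Training target: match the full N-body (Quijote-like) displacement field
model = DisplacementCorrector()
cola  = torch.randn(2, 3, 32, 32, 32)              # stand-in COLA displacements
nbody = torch.randn(2, 3, 32, 32, 32)              # stand-in N-body displacements
loss = nn.functional.mse_loss(model(cola), nbody)
loss.backward()
```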
- Model
- Which model will we use
- Until now, I have used 3 convolutional layers followed by 3 fully connected layers.
- We can vary the number of convolutional layers (see the sketch after this list).
- Or we can switch to a completely different model such as a ViT (Vision Transformer).
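A small sketch of making the convolutional depth a hyperparameter, assuming PyTorch; the doubling channel schedule is an illustrative choice.

```python
import torch.nn as nn

def make_conv_stack(n_conv_layers, in_channels=1, base_channels=16):
    """Build a 3D conv stack of variable depth; channels double at each layer."""
    layers, c_in = [], in_channels
    for i in range(n_conv_layers):
        c_out = base_channels * 2 ** i
        layers += [nn.Conv3d(c_in, c_out, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2)]
        c_in = c_out
    return nn.Sequential(*layers)

# e.g. compare 2-, 3-, and 4-layer variants in front of the same fully connected head
conv4 = make_conv_stack(n_conv_layers=4)
```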
- Cosmological parameter
- Which parameter will we predict
- Until now, we have predicted Ωm and σ8, with the training cosmologies sampled on a regular grid in parameter space.
- But we can increase the number of cosmological parameters and sample them with Latin Hypercube Sampling (LHS); see the sampling sketch below.
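A minimal sketch of Latin Hypercube sampling for an extended parameter set, using SciPy's qmc module; the parameter names and ranges below are illustrative (Quijote-like) assumptions, not the final choice for this project.

```python
from scipy.stats import qmc

# Illustrative parameter ranges; adjust to the actual study
names    = ["Omega_m", "Omega_b", "h", "n_s", "sigma_8"]
l_bounds = [0.1,       0.03,      0.5, 0.8,   0.6]
u_bounds = [0.5,       0.07,      0.9, 1.2,   1.0]

sampler = qmc.LatinHypercube(d=len(names), seed=42)
unit_samples = sampler.random(n=2000)                       # points in [0, 1)^d
cosmologies  = qmc.scale(unit_samples, l_bounds, u_bounds)  # shape (2000, 5)

print(dict(zip(names, cosmologies[0])))                     # one sampled cosmology
```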