CPLUOS Astrophysics team meeting

Asia/Seoul
Department of Physics

Zoom Meeting https://uos-ac-kr.zoom.us/j/8264902652 Meeting ID: 826 490 2652
Description

https://uos-ac-kr.zoom.us/j/8264902652

    • 16:30–17:45
      COSMOLOGY
      • 16:30
        Dr. Sangnam Park's Report - FDM_Offset 15m
        Speaker: Sangnam Park
      • 16:45
        Hyeonmo's report - FDM vs CDM Halo Collision 15m
        Speaker: Hyeonmo Koo (University of Seoul)
        • FDM vs CDM Halo Collision
          • Some Bumps in CDM Simulation
          • In the initial velocity [28.184 km/s] vs. velocity decrease [28.184 km/s] graph,
            • the CDM velocity-decrease curve shows a strange feature: as the initial velocity v_0 increases, Δv does not decrease smoothly but shows a few bumps.
          • Oscillation Period of Particle
            • T_(1/2)=0.06τ_gadget
            • This value of 0.06 matches the period of the oscillation in time of r_(1/2) of an individual halo (a toy illustration of extracting such a period follows below).
            • Since the collision of two halos is a highly violent (catastrophic) event, a perturbative interpretation is not possible.
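          • A toy illustration (not the analysis from the talk) of how such a period can be read off: track r_(1/2) over snapshots and take the dominant peak of its power spectrum; the synthetic signal and its 0.06 τ_gadget period below are placeholders.

            import numpy as np

            t = np.linspace(0.0, 1.0, 400)                     # snapshot times [tau_gadget]
            r_half = 1.0 + 0.1 * np.sin(2 * np.pi * t / 0.06)  # toy r_1/2(t), period 0.06
            power = np.abs(np.fft.rfft(r_half - r_half.mean()))**2
            freq = np.fft.rfftfreq(len(t), d=t[1] - t[0])
            period = 1.0 / freq[np.argmax(power[1:]) + 1]      # skip the zero frequency
            print(f'T_1/2 ~ {period:.3f} tau_gadget')          # ~0.06 for this toy signal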
      • 17:00
        Young's Report - FoF and MST Algorithms 15m
        Speaker: Young Ju
        • FoF and MST algorithm
          • The density of the 50 clusters and of the 5k, 10k, 50k noise levels
          • The algorithms should find at least 49 clusters
          • Controlled random data 1 : mass function (a minimal sketch of the recipe follows at the end of this list)
            • 1. Build a halo mass function using the colossus package
            • 2. Convert halo mass to a number of galaxies using an HOD
            • 3. Compute the cumulative distribution and draw the number of galaxies with uniform random numbers
            • 4. Build the cluster distribution
          • Controlled random data 1 : noise data
            • 4542 galaxies in total; noise levels: 5k, 10k, 50k
          • Controlled random data 1 : 5k noise data
            • groups with > 50 members, (MST, FoF, MGS, DBSCAN, Hierarchical) = (17, 3, 47, 39, 17)
          • Controlled random data 1 : 50k noise data
            • groups with > 50 members, (MST, FoF, MGS, DBSCAN, Hierarchical) = (1, 3, 331, 1, 1)
          • Controlled random data 1 : 5k noise data, linking length = 0.3 x mean separation = 2.83
            • groups with > 50 members, (MST, FoF, MGS, DBSCAN, Hierarchical) = (38, 47, 19, 14, 38)
          • Controlled random data 1 : 50k noise data
            • groups with > 50 members, (MST, FoF, MGS, DBSCAN, Hierarchical) = (1, 3, 331, 1, 1)
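          • A minimal Python sketch of the recipe above (steps 1-4 plus an FoF-style linking-length grouping). The box size, HOD form, member scatter, and noise count are illustrative assumptions, not the values from the talk; only the colossus mass function and the 0.3 x mean-separation linking length come from the notes above.

            import numpy as np
            from scipy.spatial import cKDTree
            from scipy.sparse import csr_matrix
            from scipy.sparse.csgraph import connected_components
            from colossus.cosmology import cosmology
            from colossus.lss import mass_function

            cosmology.setCosmology('planck18')
            rng = np.random.default_rng(42)
            box = 100.0                                      # Mpc/h, placeholder

            # 1. halo mass function dn/dlnM from colossus
            M = np.logspace(13.5, 15.5, 200)                 # Msun/h, placeholder range
            dndlnM = mass_function.massFunction(M, 0.0, mdef='fof',
                                                model='watson13', q_out='dndlnM')

            # 2. toy HOD: mean galaxies per halo (placeholder power law,
            #    amplitude chosen to give a few thousand members in total)
            N_mean = 60.0 * (M / M[0])**0.7

            # 3. cumulative distribution over mass, inverted with uniform randoms
            cdf = np.cumsum(dndlnM)                          # M is log-spaced, so this ~ CDF in lnM
            cdf /= cdf[-1]
            M_cl = np.interp(rng.uniform(size=50), cdf, M)   # 50 cluster masses
            n_mem = rng.poisson(np.interp(M_cl, M, N_mean))  # galaxies per cluster

            # 4. cluster distribution: members scattered around random centres, plus noise
            centres = rng.uniform(0, box, size=(50, 3))
            members = np.vstack([c + rng.normal(scale=1.0, size=(n, 3))
                                 for c, n in zip(centres, n_mem)])
            noise = rng.uniform(0, box, size=(5000, 3))      # "5k" noise level
            pts = np.vstack([members, noise])

            # FoF-style grouping with linking length = 0.3 x mean separation
            link = 0.3 * box / len(pts)**(1.0 / 3.0)
            pairs = np.array(sorted(cKDTree(pts).query_pairs(link)))
            adj = csr_matrix((np.ones(len(pairs)), (pairs[:, 0], pairs[:, 1])),
                             shape=(len(pts), len(pts)))
            n_grp, labels = connected_components(adj, directed=False)
            print('groups with > 50 members:', np.sum(np.bincount(labels) > 50))

          • The same mock can then be fed to MST, MGS, DBSCAN, or hierarchical clustering for the comparison in the bullets above.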
      • 17:15
        CHOA (Cosmology of High-Order Statistics) 15m
        Speaker: Se Yeon Hwang
        • CHOA (Cosmology of High-Order Statistics)
          • Running the code using random pairs only: the result should go to 0
          • We checked the 10 results
          • Estimator problem
            • In practice, the N-point correlation functions are estimated from pair counts (DD, DR, RR)
            • There are many kinds of estimators
            • e.g. DD = (pair count) / [N(N-1)]
            • To get the N-point correlation functions, we have been using estimators of the form:
              • 2pcf = (DD/RR)-1
              • 3pcf = (DDD/RRR)-1
              • 4pcf = (DDDD/RRRR)-1
            • We are changing to the Landy & Szalay (1993) estimator:
              • 2pcf = (D-R)^2/RR
              • 3pcf = (D-R)^3/RRR
              • 4pcf = (D-R)^4/RRRR
          • What difference does the estimator make? (schematic examples for the 2pcf)
            • Much of the literature reports that the LS estimator is more accurate (smaller variance and weaker sensitivity to edge effects)
          • Methods
            • Prepare the data pairs as follows (following Philcox 2021)
            • Build a new catalog N that combines the data and random points (α is the normalization factor): N = D - αR
            • Build a new normalized random catalog: new R = αR
            • Because the data catalog is already normalized, we turn off the normalization in the code
            • So, when we calculate N pairs and R pairs, we get the LS estimator (see the sketch at the end of this block):
              • 2pcf = NN/RR = (D - αR)^2 / (α^2 RR) = (DD - 2αDR + α^2 RR) / (α^2 RR)
              • 3pcf = NNN/RRR
              • 4pcf = NNNN/RRRR
          • Checks in the code
            • Ngram_nonorm uses N and the normalized R data points: NN/RR
            • kstat(Ls) uses D and R: (DD - 2DR + RR)/RR
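          • A hedged sketch (not the group's actual code) of the combined-catalog trick above: brute-force pair counts on small placeholder catalogs, checking that weighting data points by +1 and randoms by -α makes NN/(α^2 RR) reproduce the explicit LS form (DD - 2αDR + α^2 RR)/(α^2 RR).

            import numpy as np
            from scipy.spatial.distance import cdist

            rng = np.random.default_rng(0)
            D = rng.uniform(0, 100, size=(500, 3))      # placeholder "data" points
            R = rng.uniform(0, 100, size=(2000, 3))     # placeholder random points
            alpha = len(D) / len(R)                     # normalization factor
            bins = np.linspace(2.0, 30.0, 15)           # separation bins, placeholder

            def paircount(a, b, wa, wb):
                """Weighted pair counts per separation bin (self-pairs excluded)."""
                w = np.outer(wa, wb)
                if a is b:
                    np.fill_diagonal(w, 0.0)
                counts, _ = np.histogram(cdist(a, b), bins=bins, weights=w)
                return counts

            ones_D, ones_R = np.ones(len(D)), np.ones(len(R))
            DD = paircount(D, D, ones_D, ones_D)
            DR = paircount(D, R, ones_D, ones_R)
            RR = paircount(R, R, ones_R, ones_R)

            # Landy & Szalay (1993) with explicit pair counts
            ls_explicit = (DD - 2 * alpha * DR + alpha**2 * RR) / (alpha**2 * RR)

            # Same result from the combined catalog N = D - alpha*R:
            # data points carry weight +1, randoms weight -alpha, then NN/(alpha^2 RR)
            pts = np.vstack([D, R])
            wts = np.concatenate([ones_D, -alpha * ones_R])
            NN = paircount(pts, pts, wts, wts)
            ls_combined = NN / (alpha**2 * RR)

            assert np.allclose(ls_explicit, ls_combined)

          • The same weighting extends to the higher orders by counting triplets and quadruplets of the combined catalog, matching the NNN/RRR and NNNN/RRRR forms above.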
      • 17:30
        Dr. Sabiu's Report - Density Reconstruction 15m
        Speaker: Dr Cristiano Sabiu (University of Seoul)
        • Density Reconstruction
          • 21cm Part 2!
            • We will update the previous work with a more rigorous approach
            • Use realistic foreground maps AND recover input cosmology
            • Add warm DM (neutrinos) and add flexibility to the fiducial model, i.e. ±N σ around the Planck values
          • Determining the underlying density field from the distribution and 1-D velocities of galaxies in redshift space via machine learning
    • 17:45–17:55
      Discussion 10m