Das Klingeln des Universums (The Ringing of the Universe)


Click the link below for a YouTube video showing what the Universe looks like. This is the first ever correct view of the Universe, so enjoy it...:)

My view of the Universe

Figure Axis description

The plots are 2-point correlations, that is, one chooses a center (displayed in the legend) and creates spheres of increasing radius (shown on the x axis). Then you sum the mass of the galaxies on the surface of that sphere. The surface is not infinitely thin, because the radius values are binned, so the surface has a volume. That mass is plotted on the y axis. The first plot is the most important, since it contains the most information. As you move the centers farther away from us, there are fewer and fewer observations. We can only observe up to one radian (the size of the Universe is one radian, or the 4D radius). As we move the center of the 2-point correlation, the maximum sphere we can draw gets clipped by our observational window of one radian.
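
To make the construction concrete, here is a minimal sketch of the shell-sum step (not the full script at the end of the post), using hypothetical toy arrays in place of the catalog:

import numpy as np

# Toy stand-ins for the catalog: 3D positions (in radians) and a mass proxy per galaxy
rng = np.random.default_rng(0)
positions = rng.uniform(-1.0, 1.0, size=(100000, 3))
masses = rng.uniform(0.5, 1.5, size=100000)

center = positions[0]                               # the chosen center
radii = np.linalg.norm(positions - center, axis=1)  # distance from the center to every galaxy

bins = np.arange(0.0, 2.0, 0.01)                    # shells 0.01 thick (the radius binning)
shell_mass, edges = np.histogram(radii, bins=bins, weights=masses)
# shell_mass[k] is the summed mass proxy of the galaxies whose distance from the
# center falls in [edges[k], edges[k+1]) -- the quantity plotted on the y axis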

The number of galaxies in a small sphere near our center point is smaller, so the statistics of the first measurement (legend = 0) suffer (seen in the small ripples, which are likely statistical). Remember, I bin the data into different radii; the larger the radius, the larger the number of galaxies I can choose in my sampling. Remember, this is the result of sampling the total number of galaxies (1.3 million). A supercomputer could do a better job than my Mac. That said, the consistent shapes in the current figures indicate that the statistical error does not change the qualitative conclusion that we are seeing a beating. The exact shape of the figure depends upon the choice of mass proxy. That shape might change as I think about the problem. The overlying oscillations should not (since I can't conceive of a model where introducing a mass proxy would make this 2-point correlation undulate).

Baryonic Acoustic Oscillations are extremely light imprints (vibrations) on the initial mass distribution that supposedly grew to become the galaxy clusters in our current Universe. The current view is that these oscillations were plasma oscillations. I beg to disagree.

The reason is that the plasma sound speed is small compared to the speed of light. Plasma fluctuations would create smaller features in the Universe. In HU, the Universe started as a 180-light-second-wide Black Hole. A little later (easily calculable), its density became that of a Neutron Star. During this phase transition, the speed of sound can compete with the speed of light. I think that is the phase where these undulations were imprinted. As the Universe expands, the speed of sound decreases, and the undulations slowly dissipate. The increased density in places leads to increased galaxy formation. The subsequent plasma waves create the smaller structure (clusters of galaxies).

The current explanation of Baryonic Acoustic Oscillations doesn't make sense....:) That is the reason people have trouble understanding it...:)

Das Klingeln des Universums (The Ringing of the Universe)

The Hypergeometrical Universe theory (HU) proposes that the Universe is a lightspeed-expanding hyperspherical hypersurface. Peering into the past looks like this:

HU challenged General Relativity, Dark Energy, and the Friedmann-Lemaître prescription for the composition of the Universe by challenging the ruler used for cosmological distance measurements. Correcting for the overestimation of distances leads to a well-behaved Universe where the maximum distance observed is 0.71 of the maximum distance light could possibly have traversed (13.58 billion light-years).

HU provides a simple way to calculate alpha for any given redshift Z. Since the beginning of the Universe, matter has been moving around so as to relax the Fabric of Space (FS); HU explains that the reason things move is to relax the fabric of space. The success of the HU calculation means that, for most type 1A Supernova explosions, the Fabric of Space was relaxed. Having the FS relaxed means that the FS normal points in the radial direction: under those conditions there is no tangential motion, only radial motion. This means that whatever alpha we calculate will remain the same up to the outermost hypersphere (our current epoch).

This makes it easier to calculate distances. We can just project them onto the outermost hypersphere and take it from there.
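
For reference, this is the alpha(z) expression used in the script at the end of the post (the alpha column and the zDistance function):

import numpy as np

def alpha_from_z(z):
    # alpha(z) = pi/4 - arcsin(1 / (sqrt(2) * (1 + |z|))), as in zDistance below
    return np.pi / 4.0 - np.arcsin(1.0 / (np.sqrt(2) * (1.0 + np.abs(z))))

# Quick check: alpha_from_z(0.5) is about 0.2945 radians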

Doing this simple-minded accounting while shaking day-in and day-out on the NYC subway, I wasn't able to find any help among the astrophysicists…:) To be fair, I am forever indebted to Daniel Eisenstein, who was kind enough to reply to two of my emails. Unfortunately, the subject requires a little more information.

I will present the hypotheses (interpretation choices) I made when choosing datasets and columns. I used the column NZ, a number density, as a proxy for galaxy mass.

Mass 2-point correlation:

This is a simple and clear concept. You sit on a given galaxy and calculate the distances to each of the other 1,312,681. To each distance you assign that galaxy's NZ. Then you bin the distances and sum the NZ values within each distance bin to create the Universe 2-point correlation presented in the plots. Alternatively, I could aggregate 1s for each object. Had I done that, the curve would look different, since the data contains a larger number of galaxies at large distances. The undulations should still be there; I will double-check that.
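
To illustrate the two aggregation choices (summing NZ per distance bin versus counting 1s per object), here is a minimal sketch with hypothetical arrays standing in for the per-pair distances and their NZ values:

import numpy as np

# Hypothetical stand-ins: distance from the chosen galaxy to every other galaxy,
# and the NZ value (number density, used as a mass proxy) of each of them
rng = np.random.default_rng(1)
pair_distances = rng.uniform(0.0, 1.5, size=50000)
nz = rng.uniform(1e-5, 1e-4, size=50000)

bins = np.arange(0.0, 1.6, 0.01)

# Choice used in this post: aggregate NZ per distance bin (mass-weighted correlation)
mass_weighted, _ = np.histogram(pair_distances, bins=bins, weights=nz)

# Alternative mentioned above: aggregate 1s per object (a plain pair count)
pair_counts, _ = np.histogram(pair_distances, bins=bins)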

This is the Universe 2-point correlation for the complete view of the Universe (as complete as the four FITS files below).

This is the complete Universe composed of these four SDSS files:

These files can be found here:

galaxy_DR12v5_CMASS_North.fits.gz

galaxy_DR12v5_CMASS_South.fits.gz

galaxy_DR12v5_LOWZ_North.fits.gz

galaxy_DR12v5_LOWZ_South.fits.gz

These are large files...
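
If you just want to peek at one of the catalogs before running the full script, here is a minimal astropy sketch (assuming the file has already been downloaded and gunzipped into the working directory):

from astropy.io import fits

# Open one BOSS catalog and list its columns; the script below uses RA, DEC, Z and NZ.
# memmap=True avoids loading the whole table into memory.
with fits.open('galaxy_DR12v5_CMASS_North.fits', memmap=True) as hdul:
    table = hdul[1].data
    print(len(table), 'objects')
    print(hdul[1].columns.names)
    print(table['RA'][:5], table['DEC'][:5], table['Z'][:5], table['NZ'][:5])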

One can clearly see oscillations due to the sloshing of plasma at the time the Black Hole Universe started to disassemble. The current view is that this happened at the time the Universe became transparent (the plasma-to-gas transition). That might be correct, but I jot down the other possibility just for the record.

The figure shows the sum of the galactic masses at a varying radius (shown on the x-axis), centered at our current position. You can clearly see the ringing..:)

The frequency analysis is shown below:

One can easily recognize the bump in the 2-point correlation. This bump means that the initial acoustic oscillations resulted in a galaxy cluster around 0.7 R_0.
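
For reference, here is a minimal sketch of the frequency-analysis step, mirroring the np.fft.rfft calls in the script at the end of the post; the toy curve merely stands in for one of the binned 2-point correlations (bin width 0.01):

import numpy as np

# Toy stand-in for one binned 2-point correlation curve (bin width 0.01);
# replace it with a curve produced by the script below
r = np.arange(0.0, 1.2, 0.01)
corr = 10 + 5 * np.cos(2 * np.pi * 4 * r) * np.exp(-r)

spectrum = np.fft.rfft(corr)                 # complex spectrum
freqs = np.fft.rfftfreq(len(corr), d=0.01)   # cycles per radian

# Dominant oscillation, ignoring the zero-frequency (mean) term
k = 1 + np.argmax(np.abs(spectrum[1:]))
print('strongest ringing at about %.1f cycles per radian' % freqs[k])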

As we shift the center of the mass 2-point correlation, one can see that the figure doesn't change, other than the fact that the data is limited to our horizon.

This is consistent with the Universe being homogeneous and isotropic on large scales.

The quality of the data is astounding. In fact, it is not the quality of the data; it is the quality of the model. Lambda-CDM analyzes the same data with a bad model. Below is the quality of the resulting 2-point correlation:

Same data - different model…:)

The quality of the model is so bad that one starts to wonder if the recurrence is actually where the model places it. 150 Mpc doesn't match my 0.75 R_0.

This means that L-CDM and Inflation Theory distorted distances to such a degree that an actual recurrence might have been displaced, or maybe never existed in the first place. In doing so, L-CDM misled scientists into characterizing the dynamics of Dark Matter incorrectly. This recurrence is attributed to an anchored Dark Matter interaction. The proposed physics requires Dark Matter to attract Matter but not be influenced by Matter as it sloshes around in the early Universe. That is a lot of new Physics, probably proposed based on an artifact of Inflation Theory.

This peak is nowhere to be found in my plots. This means that the Baryonic Acoustic Oscillations have NO Dark Matter INFLUENCE.

This is how I missed watching the Super Bowl... :)

PS - If you want to have fun with the Universe..:) just track down the SDSS files and use my Python script below.

import os
import astropy.units as u
from astropy.io import fits
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from numba import jit  # numba.decorators was removed in newer numba releases
import matplotlib.cm as cm
from scipy.optimize import curve_fit
import timeit


def read_test_pyfits(filename, colname):
    # Read a single column from the first table extension of a FITS file
    with fits.open(filename, memmap=True) as hdul:
        data = hdul[1].data[colname]
        return data.copy()


def read_nobs_pyfits(filename):
    # Return the number of rows and the column names of the first table extension
    with fits.open(filename, memmap=True) as hdul:
        data = hdul[1].data
        return np.shape(data)[0], hdul[1].columns.names


def get_BOSS_data(gal):
    # Load the columns of interest from one BOSS catalog into a DataFrame and
    # derive angular coordinates and the HU alpha(z) for each galaxy
    nObs, cols = read_nobs_pyfits(gal)
    colnames = [x for x in cols if x in ['ID', 'RA', 'DEC', 'Z', 'NZ', 'BOSS_SPECOBJ_ID',
                                         'BOSS_TARGET1', 'BOSS_TARGET2', 'EBOSS_TARGET0', 'ZOFFSET', 'TARGETOBJID',
                                         'OBJID', 'PLUG_RA', 'PLUG_DEC', 'Z']]
    ncols = len(colnames)
    myGalaxy = pd.DataFrame(data=np.zeros([nObs, ncols]), columns=colnames)
    for rowname in myGalaxy.columns:
        # FITS tables are big-endian; convert to native byte order for pandas
        myGalaxy[rowname] = read_test_pyfits(gal, rowname).byteswap().newbyteorder()
    print(myGalaxy.columns)
    pi4 = np.pi / 4.0
    sqrt2 = np.sqrt(2)
    myGalaxy.DEC = myGalaxy.DEC.round(1)
    myGalaxy.RA = myGalaxy.RA.round(1)
    myGalaxy['CosRA'] = np.cos(myGalaxy.RA / 180.0 * np.pi)
    myGalaxy['SinRA'] = np.sin(myGalaxy.RA / 180.0 * np.pi)
    myGalaxy['CosDEC'] = np.cos((90 - myGalaxy.DEC) / 180.0 * np.pi)
    myGalaxy['SinDEC'] = np.sin((90 - myGalaxy.DEC) / 180.0 * np.pi)
    myGalaxy.Z = myGalaxy.Z.abs()
    myGalaxy['distance0'] = np.abs(zDistance(myGalaxy.Z))
    myGalaxy['distance'] = 0.0
    myGalaxy['density'] = 0.0
    # alpha(z) = pi/4 - arcsin(1 / (sqrt(2) * (1 + |z|))), rounded to 3 decimals
    myGalaxy['alpha'] = np.round(pi4 - np.arcsin(1 / sqrt2 / (1 + np.abs(myGalaxy.Z))), 3)
    myGalaxy = myGalaxy.sort_values(by=['Z'])
    myGalaxy.reset_index(drop=True, inplace=True)
    return myGalaxy


def zDistance(Z):
    # Same alpha(z) expression used in get_BOSS_data: pi/4 - arcsin(1/(sqrt(2)(1+|Z|)))
    pi4 = np.pi / 4.0
    sqrt2 = np.sqrt(2)
    return np.round(pi4 - np.arcsin(1 / sqrt2 / (1 + np.abs(Z))), 3)


# jit falls back to object mode on this pandas-heavy code; forceobj=True keeps
# newer numba versions from raising when nopython compilation fails
@jit(forceobj=True)
def get_distances(myGalaxy, N1=0, N2=1):
    # Average the mass-weighted 2-point correlation over galaxies N1..N2,
    # each taken in turn as the center
    N2Max = myGalaxy.shape[0]
    autocorr = pd.Series(data=np.zeros([200, ]))
    if N2 > N2Max:
        N2 = N2Max
    for i in range(N1, N2):
        CosRA = myGalaxy.iloc[i].CosRA
        SinRA = myGalaxy.iloc[i].SinRA
        CosDEC = myGalaxy.iloc[i].CosDEC
        SinDEC = myGalaxy.iloc[i].SinDEC
        r = myGalaxy.iloc[i].alpha
        rPrime = myGalaxy.alpha
        # Cartesian coordinates of the center galaxy, repeated for every row
        v1 = np.tile([r * CosDEC * CosRA, r * CosDEC * SinRA, r * SinDEC],
                     myGalaxy.shape[0]).reshape([myGalaxy.shape[0], 3])
        # Cartesian coordinates of every galaxy (transpose, not reshape, so that
        # each row is one galaxy's (x, y, z))
        v2 = np.array([rPrime * myGalaxy.CosDEC * myGalaxy.CosRA,
                       rPrime * myGalaxy.CosDEC * myGalaxy.SinRA,
                       rPrime * myGalaxy.SinDEC]).T
        # Euclidean distance, binned into hundredths of a radian
        myGalaxy.distance = (100 * np.sqrt(((v1 - v2) ** 2).sum(axis=1))).astype(int)
        # The mass proxy aggregated per distance bin is simply NZ
        myGalaxy.density = myGalaxy.NZ
        autocorr = autocorr.add(myGalaxy.groupby(['distance'])['density'].sum(), fill_value=0)
    return autocorr / (N2 - N1)

@jit(forceobj=True)
def get_map(myGalaxy):
    # Mass proxy summed per (RA, DEC, distance) cell
    return myGalaxy.groupby(['RA', 'DEC', 'distance'])['density'].sum()

def func(x, *params):
    # Sum of Gaussians (center, amplitude, width triplets) for curve_fit
    y = np.zeros_like(x)
    for i in range(0, len(params), 3):
        ctr = params[i]
        amp = params[i + 1]
        wid = params[i + 2]
        y = y + amp * np.exp(-((x - ctr) / wid) ** 2)
    return y

if __name__=='__main__':

    gals = ['galaxy_DR12v5_CMASS_North.fits','galaxy_DR12v5_CMASS_South.fits',
           'galaxy_DR12v5_LOWZ_North.fits','galaxy_DR12v5_LOWZ_South.fits']

    myGalaxy0 = get_BOSS_data('galaxy_DR12v5_CMASS_North.fits')
    numGalaxies=myGalaxy0.shape[0]

    gal = 'galaxy_DR12v5_CMASS_South.fits';
    myGalaxy1 = get_BOSS_data(gal)
    numGalaxies1=myGalaxy1.shape[0]

    gal = 'galaxy_DR12v5_LOWZ_North.fits';
    myGalaxy2 = get_BOSS_data(gal)
    numGalaxies2=myGalaxy2.shape[0]

    gal = 'galaxy_DR12v5_LOWZ_South.fits';
    myGalaxy3 = get_BOSS_data(gal)
    numGalaxies3=myGalaxy3.shape[0]

    # reset the index so the concatenated catalog has unique labels matching positions
    myGalaxy = pd.concat([myGalaxy0, myGalaxy1, myGalaxy2, myGalaxy3]).reset_index(drop=True)


    start_time = timeit.default_timer()


    # Pick ~20 evenly spaced values of distance0 and, for each, the first galaxy
    # at that distance; these galaxies become the centers of the 2-point correlations
    chunckGalaxies = 500
    maxNum = myGalaxy.shape[0]
    dMax = myGalaxy.distance0.max()
    n = 21
    positions = np.round([i * dMax / n for i in range(n - 1)], 3)
    inds = []
    for pos in positions:
        indsGroup = myGalaxy[myGalaxy['distance0'] == pos].index.tolist()
        if len(indsGroup) != 0:
            inds.append(min(indsGroup))

    autocorr={}
    N1=0
    for i in inds[1:]:
        if((i-N1)>chunckGalaxies):
            N2=N1+chunckGalaxies
        else:
            N2=i
        autocorr[myGalaxy.distance0.iloc[i]] = get_distances(myGalaxy,N1=N1,N2=N2)
        N1=i
        elapsed = timeit.default_timer() - start_time
        print(i, zDistance(myGalaxy.Z.iloc[i]), elapsed)


    # Plot the correlation for the first center only
    df = pd.DataFrame.from_dict(data=autocorr)
    df.columns = ['centered at ' + str(x) for x in df.columns]
    df.index = df.index * 0.01          # distance bins are hundredths of a radian
    ax = df.plot(y=df.columns[0], title='Universe 2-point correlation', legend=True)
    ax.set_xlabel('radius (radians)')
    fig = plt.gcf()
    fig.set_size_inches(18.5, 10.5)
    plt.xlim([0, 1.2])
    plt.ylim([0, 20])
    fig.savefig('./CloseUniverseFinal0.png', dpi=100)
    df.to_excel('./CloseUniverseFinal0.xlsx')

    # Plot the correlations for the first ten centers
    df = pd.DataFrame.from_dict(data=autocorr)
    df.columns = ['centered at ' + str(x) for x in df.columns]
    df.index = df.index * 0.01
    ax = df.plot(y=df.columns[0:10], title='Universe 2-point correlation', legend=True)
    ax.set_xlabel('radius (radians)')
    fig = plt.gcf()
    fig.set_size_inches(18.5, 10.5)
    plt.xlim([0, 1.2])
    plt.ylim([0, 20])
    fig.savefig('./CloseUniverseFinal1.png', dpi=100)
    df.to_excel('./CloseUniverseFinal1.xlsx')

    # Plot the correlations for centers 11 through 15
    df = pd.DataFrame.from_dict(data=autocorr)
    df.columns = ['centered at ' + str(x) for x in df.columns]
    df.index = df.index * 0.01
    ax = df.plot(y=df.columns[10:15], title='Universe 2-point correlation', legend=True)
    ax.set_xlabel('radius (radians)')
    fig = plt.gcf()
    fig.set_size_inches(18.5, 10.5)
    plt.xlim([0, 1.2])
    plt.ylim([0, 20])
    fig.savefig('./CloseUniverseFinal2.png', dpi=100)
    df.to_excel('./CloseUniverseFinal2.xlsx')

    # Frequency analysis: plot the real part of the FFT of the first correlation curve
    fig = plt.figure()
    df = pd.DataFrame.from_dict(data=autocorr)
    ax = plt.plot(np.fft.rfft(df[df.columns[0]]).real)
    plt.xlim([0, 30])
    plt.ylim([-15, 35])
    plt.xlabel('Universe Ringing Frequency')
    plt.grid(True)
    fig.set_size_inches(9.5, 5.5)
    plt.xticks([i for i in range(0, 30)])
    plt.tick_params(axis='x')
    fig.savefig('./UniverseRinging1.png', dpi=200)
    df.to_csv('./UniverseRinging.csv')

    fig=plt.figure()
    ax = plt.plot(np.fft.rfft(df[df.columns[0]]).real)
    plt.xlim([0,10])
    plt.ylim([-15,40])
    plt.xlabel('Universe Ringing Frequency')
    plt.grid(True)
    fig.set_size_inches(9.5, 5.5)
    plt.xticks([i for i in range(0,10)])
    plt.tick_params(axis='x')
    fig.savefig('./UniverseRinging2.png', dpi=200)

    fig=plt.figure()
    ax = plt.plot(np.fft.rfft(df[df.columns[0]]).real)
    plt.xlim([0,10])
    plt.ylim([-15,240])
    plt.xlabel('Universe Ringing Frequency')
    plt.grid(True)
    fig.set_size_inches(9.5, 5.5)
    plt.xticks([i for i in range(0,10)])
    plt.tick_params(axis='x')
    fig.savefig('./UniverseRinging3.png', dpi=200)
