Author Topic: Coding  (Read 179945 times)

atomic7732

  • Global Moderator
  • *****
  • Posts: 3828
  • caught in the river turning blue
    • Paladin of Storms
Re: Coding
« Reply #750 on: July 04, 2019, 01:31:09 AM »
can we get a subtraction (or other diff) between the two pics

FiahOwl

  • *****
  • Posts: 1234
  • This is, to give a dog and in recompense desire my dog again.
Re: Coding
« Reply #751 on: July 04, 2019, 11:02:23 AM »
here you go

Darvince

  • *****
  • Posts: 1835
  • 差不多
Re: Coding
« Reply #752 on: July 04, 2019, 08:38:05 PM »
This is, to give a dog and in recompense desire my dog again.

vh

  • formerly mudkipz
  • *****
  • Posts: 1138
  • "giving heat meaning"
Re: Coding
« Reply #753 on: August 26, 2019, 10:24:48 PM »


new drizzle experiment

bottom right = high res image of a 2d signal
top left = a simulated image of that signal, where the imaging sensor is 21 x 21 px
going from top left to bottom right, the other images are 10, 100, and 1000 exposures taken with *the same low res sensor*, but drizzled into a high resolution map. the imaging sensor is jittered with a random 1 px translation and a random 0.05 radian rotation each time.

i think it's fairly cool to see exactly how much detail can be retrieved even with such a low res sensor. presumably some sort of deconvolution could be applied to undo the blurring caused by the finite-sized photosites.
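the accumulation step can be sketched in a few lines of numpy. this is a toy nearest-neighbor version (real drizzle distributes each input pixel's flux over the overlapping output pixels with area weights), and the function and variable names here are mine:

```python
import numpy as np

def drizzle_naive(frames, shifts, scale):
    """Toy 'drizzle': accumulate low-res frames onto a grid `scale`
    times finer, using known sub-pixel (dy, dx) shifts per frame."""
    h, w = frames[0].shape
    out = np.zeros((h * scale, w * scale))
    hits = np.zeros_like(out)
    for frame, (dy, dx) in zip(frames, shifts):
        for y in range(h):
            for x in range(w):
                # drop each low-res sample at its shifted fine-grid position
                hy = int(round((y + dy) * scale)) % (h * scale)
                hx = int(round((x + dx) * scale)) % (w * scale)
                out[hy, hx] += frame[y, x]
                hits[hy, hx] += 1
    # average where we have samples; unhit cells stay zero
    return out / np.maximum(hits, 1)
```

with many randomly jittered frames, most fine-grid cells eventually get hit, which is why the detail keeps improving from 10 to 100 to 1000 exposures.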

atomic7732

  • Global Moderator
  • *****
  • Posts: 3828
  • caught in the river turning blue
    • Paladin of Storms
Re: Coding
« Reply #754 on: August 27, 2019, 02:24:58 AM »
interesting

why isn't this technique used more

(maybe it is already used quite a bit in some fields, i wouldn't know)

vh

  • formerly mudkipz
  • *****
  • Posts: 1138
  • "giving heat meaning"
Re: Coding
« Reply #755 on: August 27, 2019, 12:58:10 PM »
afaik it was actually invented for processing Hubble images, where the resolution of the optics exceeds the resolution of the sensor

vh

  • formerly mudkipz
  • *****
  • Posts: 1138
  • "giving heat meaning"
Re: Coding
« Reply #756 on: August 27, 2019, 11:55:31 PM »


zoom to see detail... but basically this is the above drizzling technique applied to super-res a scene i took 20 images of.

right is the original, left is drizzled

still a bunch of artifacts because it's not quite perfected yet. also both images have a weird tint because i haven't figured out how to apply color correction / calibration yet

tuto99

  • *****
  • Posts: 533
  • Baba Booey
Re: Coding
« Reply #757 on: August 31, 2019, 08:32:14 AM »
I'm pretty sure there's a threshold for the input image quality right? Like at what point would there not be enough detail to make a reasonably detailed drizzled image?

Also isn't this photo-optimizing technique, or a technique like this, used for security camera snapshots? I know you see the stuff in movies but it's a real thing right? (Or for like license plates)

atomic7732

  • Global Moderator
  • *****
  • Posts: 3828
  • caught in the river turning blue
    • Paladin of Storms
Re: Coding
« Reply #758 on: August 31, 2019, 11:29:26 AM »
ENHANCE

tuto99

  • *****
  • Posts: 533
  • Baba Booey
Re: Coding
« Reply #759 on: August 31, 2019, 01:04:11 PM »
That was the word I was trying to remember

vh

  • formerly mudkipz
  • *****
  • Posts: 1138
  • "giving heat meaning"
Re: Coding
« Reply #760 on: August 31, 2019, 04:44:58 PM »
Quote from: tuto99 on August 31, 2019, 08:32:14 AM
I'm pretty sure there's a threshold for the input image quality right? Like at what point would there not be enough detail to make a reasonably detailed drizzled image?

Also isn't this photo-optimizing technique, or a technique like this, used for security camera snapshots? I know you see the stuff in movies but it's a real thing right? (Or for like license plates)

well it's not magic. basically there are two bottlenecks (well, there are many, but two are relevant here) to capturing more detail: the quality of the optics in the camera, and the resolution of the sensor.

if sensor resolution is the bottleneck -- say each photosite on the camera sensor is 1 mm across -- then by shifting the camera exactly 0.5 mm and taking another photo, you're effectively doubling the resolution of the sensor, and mathematically you can take advantage of that.

of course the optics are still a limiting factor, so you can't get infinite super-resolution this way.
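that half-pixel argument in toy 1d form (assuming, for simplicity, a sensor that point-samples rather than averages over each photosite):

```python
import numpy as np

# a fine-grained 1d "scene"
fine = np.sin(np.linspace(0, 4 * np.pi, 32))

# a coarse sensor that point-samples every other fine position
shot_a = fine[0::2]   # exposure with the sensor at offset 0
shot_b = fine[1::2]   # exposure shifted by exactly half a coarse pixel

# interleaving the two half-pixel-shifted exposures doubles the resolution
recovered = np.empty_like(fine)
recovered[0::2] = shot_a
recovered[1::2] = shot_b
assert np.allclose(recovered, fine)
```

with finite photosites that average instead of point-sampling, the same trick gives an overdetermined linear system rather than a direct interleave, which is where deconvolution comes in.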


vh

  • formerly mudkipz
  • *****
  • Posts: 1138
  • "giving heat meaning"
Re: Coding
« Reply #761 on: October 06, 2019, 01:33:48 AM »
today we have an optiksim update!!!!

previously we had this

today i wrote a massive amount of code to implement fully automated luxury gay space lens design. ok, not really, but it was very tricky to get right, and i'm kind of surprised it works at all. the result is that instead of hand-designing lenses, i have an optimization algorithm to do it for me. it is fairly amazing

before optimization


after optimization


observe, compared to my previous lens design, the sharpness at all angles, instead of only directly in the middle of the image.

the design is inspired by the zeiss protar, considered to be the first modern photographic lens


my original lens was pretty much completely unusable. my new optimized lens has an average "sharpness" of 0.37 mm (the plots on the right are 1 mm^2; a modern camera's photosites are about 0.005 mm across), so i have a long way to go. but i calculated that this lens should be able to generate sharp images up to 200 by 200 px!!!
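for reference, a plausible way to get an average "sharpness" number like that is the RMS spot radius of the traced rays on the image plane. this is my guess at the metric, and the function name is made up:

```python
import numpy as np

def rms_spot_radius(landing_points_mm):
    """RMS distance of ray landing points from their centroid, in mm.
    One value per field angle; averaging over field angles would give
    a single sharpness figure for the whole lens."""
    pts = np.asarray(landing_points_mm, dtype=float)
    return float(np.sqrt(((pts - pts.mean(axis=0)) ** 2).sum(axis=1).mean()))
```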

vh

  • formerly mudkipz
  • *****
  • Posts: 1138
  • "giving heat meaning"
Re: Coding
« Reply #762 on: October 19, 2019, 09:31:42 PM »
tmc's game against <the machine>

https://lichess.org/N5vtK3S9#0

vh

  • formerly mudkipz
  • *****
  • Posts: 1138
  • "giving heat meaning"
Re: Coding
« Reply #763 on: October 20, 2019, 12:11:06 PM »
high res gcmap project (click to embiggen)




vh

  • formerly mudkipz
  • *****
  • Posts: 1138
  • "giving heat meaning"
Re: Coding
« Reply #764 on: October 20, 2019, 01:01:35 PM »
code

vh

  • formerly mudkipz
  • *****
  • Posts: 1138
  • "giving heat meaning"
Re: Coding
« Reply #765 on: July 24, 2020, 08:19:05 PM »
owo wats this


vh

  • formerly mudkipz
  • *****
  • Posts: 1138
  • "giving heat meaning"
Re: Coding
« Reply #766 on: July 24, 2020, 10:23:01 PM »
i finally ironed out all the issues with my gravity code (which has to take into account periodic boundary conditions, for both positions and forces)

2^24 ~ 17 million particles -> one timestep every 41 seconds (order 4 chebyshev) or 30 seconds (order 2)
2^22 ~ 4 million particles -> one timestep every 7 seconds
2^20 ~ 1 million particles -> one timestep every 4 seconds (not really efficient)



next steps are probably
1. figuring out how to draw 17 million objects, cause matplotlib sure isn't going to like it
2. figure out all the right constants so my simulation doesn't blow up
3. running the damn thing

atomic7732

  • Global Moderator
  • *****
  • Posts: 3828
  • caught in the river turning blue
    • Paladin of Storms
Re: Coding
« Reply #767 on: July 25, 2020, 12:29:31 AM »
universe sandbox 4 is coming along nicely

Bla

  • Global Moderator
  • *****
  • Posts: 1013
  • The stars died so you can live.
Re: Coding
« Reply #768 on: July 25, 2020, 08:44:41 AM »
That sounds like good performance. About drawing: I once worked on a simulation where I used a 2d histogram, counting particles within each column and rendering with a density scale. That worked well compared to plotting individual particles, and I think it was faster too.
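that histogram approach in numpy terms (a sketch; the bin count and log scaling are my choices, and the random points just stand in for the particle positions):

```python
import numpy as np

rng = np.random.default_rng(0)
pts = rng.normal(size=(1_000_000, 2))   # stand-in for particle positions

# count particles per column instead of drawing them individually
density, xedges, yedges = np.histogram2d(pts[:, 0], pts[:, 1], bins=512)

# log scale keeps faint structure visible next to dense clumps
img = np.log1p(density)
```

`img` can then go straight to `plt.imshow`, which scales to 17 million particles far better than a scatter plot would.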

vh

  • formerly mudkipz
  • *****
  • Posts: 1138
  • "giving heat meaning"
Re: Coding
« Reply #769 on: July 25, 2020, 08:45:48 AM »
my previous gravity sim was in real world SI/metric units and all that, but this time around, i decided to work in "simulation units", which i call ud, ut, and um (unit distance, unit time, unit mass). so every object in my simulation has mass 1 um, each time step is 1 ut, and the entire simulation fits in a box of size 1 ud by 1 ud.

then (in theory, haven't tested this yet), the only thing i have to change about my sim is to convert the gravitational constant from metric units to sim units.

Code: [Select]
# convert the gravitational constant from SI to simulation units
ud_over_m = size                            # metres per ud
ut_over_s = timestep                        # seconds per ut
um_over_kg = (density * size**2) / nbodies  # kilograms per um (mass per body)
gravity_u = gravity * (um_over_kg * ut_over_s**2) / ud_over_m**3

which comes out to roughly 4e-16 right now

high school "dimensional analysis" practice really came in handy here

the only tricky point is that i'm trying to simulate cosmological expansion as well, which means that the size of my universe is NOT constant. but since the size of my sim world is constant, ud_over_m (and therefore the gravitational constant itself) is... not constant
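the conversion above, wrapped as a function and re-evaluated as the physical box grows, might look like this (the numeric values in the loop are placeholders, not the actual sim constants):

```python
G_SI = 6.674e-11  # m^3 kg^-1 s^-2

def gravity_u(ud_over_m, ut_over_s, um_over_kg):
    """G converted from SI to simulation units (1 box = 1 ud,
    1 step = 1 ut, 1 body = 1 um): straight dimensional analysis."""
    return G_SI * (um_over_kg * ut_over_s**2) / ud_over_m**3

# with expansion, the physical box size (metres per ud) grows with the
# scale factor a(t), so the sim-unit G must be recomputed each step
for a in (1.0, 1.5, 2.0):
    g = gravity_u(ud_over_m=1e22 * a, ut_over_s=1e13, um_over_kg=1e35)
```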

vh

  • formerly mudkipz
  • *****
  • Posts: 1138
  • "giving heat meaning"
Re: Coding
« Reply #770 on: July 25, 2020, 09:43:00 AM »


1024 objects initialized on a 32x32 grid -- you can actually see the starting configuration pretty clearly

vh

  • formerly mudkipz
  • *****
  • Posts: 1138
  • "giving heat meaning"
Re: Coding
« Reply #771 on: July 25, 2020, 05:40:38 PM »

atomic7732

  • Global Moderator
  • *****
  • Posts: 3828
  • caught in the river turning blue
    • Paladin of Storms
Re: Coding
« Reply #772 on: July 26, 2020, 04:11:04 AM »
steamboat

Darvince

  • *****
  • Posts: 1835
  • 差不多
Re: Coding
« Reply #773 on: July 26, 2020, 03:47:11 PM »
code steamboat so you can see it before you die

vh

  • formerly mudkipz
  • *****
  • Posts: 1138
  • "giving heat meaning"
Re: Coding
« Reply #774 on: July 31, 2020, 02:19:44 PM »
silly me didn't realize until today that the hubble constant isn't constant

y axis: scale-factor of the simulation, 1 being roughly "size at current time"
x axis: time, in 100-petaseconds

so the growth of the universe is indeed closer to linear than exponential (which is what i thought it was before)



this could explain why my results didn't make sense

vh

  • formerly mudkipz
  • *****
  • Posts: 1138
  • "giving heat meaning"
Re: Coding
« Reply #775 on: August 01, 2020, 12:10:08 PM »
so i finally realized what i was doing wrong (in this and months and months of previous attempts). long story short, i wasn't using the correct initial conditions. i managed to track down some lecture notes and papers on how people actually set up the initial conditions for these simulations, but i didn't understand the math, and the parts i did try to implement gave bogus results. so i ended up with some sort of approximation which doesn't actually use the math (bad), but visually looks about right.



it's so pretty

atomic7732

  • Global Moderator
  • *****
  • Posts: 3828
  • caught in the river turning blue
    • Paladin of Storms
Re: Coding
« Reply #776 on: August 01, 2020, 12:50:42 PM »
wow the warm-hot intergalactic medium

vh

  • formerly mudkipz
  • *****
  • Posts: 1138
  • "giving heat meaning"
Re: Coding
« Reply #777 on: August 30, 2020, 11:03:33 AM »
today... practicing some time series analysis by grabbing some of my co2 ppm data.

to keep the challenge reasonable, i selected a subset of the data, 3 weeks long, where i didn't leave my apartment for any trips (this was surprisingly hard to find). i used the first 2 weeks to develop my prediction model, then verified the model on the last week of data.

i first averaged the data in 15-minute groups, just to cut down on the computation load by a factor of 15.
the data exhibits both daily and also weekly patterns, which i factored out in the data preprocessing stages (only to invert the transformation later, of course)
finally, i found it unnecessary to detrend the data
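the 15-minute averaging step is just a reshape-and-mean (sketched here with fake minutely data, not the real co2 series):

```python
import numpy as np

minutely = np.arange(180.0)  # 3 hours of fake 1-minute readings
# group into rows of 15 and average each row: 180 points -> 12 points
quarter_hourly = minutely.reshape(-1, 15).mean(axis=1)
```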

(light blue on the left 2 weeks is my input data)


both a KPSS and an ADF test suggested that the data was already quite stationary without any differencing, so i just threw it into an ARIMA model with a grid search (50 experiments) over the (p, d, q) parameters, using corrected AIC as the criterion, and found that (1, 0, 4) was optimal

finally, i ran the forecast procedure (which quickly regressed to the mean, as expected: regressing to the mean is the default stance when predicting far into the future). i also simulated 100 trajectories from the ARIMA model to quickly visualize the credible bounds of my forecast
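the "simulate many futures" step can be mimicked with a mean-reverting AR(1) as a stand-in for the fitted ARIMA (phi, mu, sigma, and the last observation here are invented, not the fitted values):

```python
import numpy as np

rng = np.random.default_rng(1)
phi, mu, sigma = 0.9, 420.0, 3.0  # invented AR(1) parameters
last = 430.0                      # invented last observed value
horizon, n_paths = 96, 100        # one day of 15-minute steps, 100 futures

paths = np.empty((n_paths, horizon))
for i in range(n_paths):
    x = last
    for t in range(horizon):
        # each step pulls toward mu, so far-future forecasts regress to the mean
        x = mu + phi * (x - mu) + rng.normal(0.0, sigma)
        paths[i, t] = x

# pointwise credible band from the simulated futures
lo, hi = np.percentile(paths, [5, 95], axis=0)
```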

the final results are shown, with dark blue being the actual observed data, yellow being the forecasted ppm (which, as mentioned, quickly regresses to the seasonal means after about a day), and red being the 100 simulated futures. all in all, the forecast seems decent: the ground-truth trajectory rarely leaves the credible interval


the one-step prediction (generally not a very useful measure of forecast ability, since it only tests accuracy one step, i.e. 15 minutes, into the future) had an RMSE of 12.9 ppm, compared to 20.45 ppm for the naive strategy (always predicting the last seen value) and 14.0 ppm for the seasonal-trend-following strategy.
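for concreteness, the naive baseline and its RMSE can be computed like this (toy numbers, not the real co2 series):

```python
import numpy as np

def rmse(pred, truth):
    """Root mean squared error between two equal-length series."""
    pred, truth = np.asarray(pred, float), np.asarray(truth, float)
    return float(np.sqrt(np.mean((pred - truth) ** 2)))

y = np.array([410.0, 415.0, 412.0, 418.0, 420.0])  # toy 15-minute values
# naive one-step strategy: predict each value as the previous observation
naive_rmse = rmse(y[:-1], y[1:])
```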
« Last Edit: August 30, 2020, 11:15:33 AM by vh »