Ari Rapkin Blenkhorn

PhD in Computer Science, December 2018
University of Maryland, Baltimore County
ari@acm.org
CV (August 2019)
https://www.linkedin.com/in/ariblenkhorn


Dissertation Research

GPU-accelerated rendering of atmospheric glories

Glories are colorful atmospheric phenomena related to rainbows and coronas. They are commonly seen from aircraft when clouds are present and the sun is on the opposite side of the aircraft from the observer.

I have developed a highly parallel GPGPU implementation of the Mie scattering equations that accelerates calculation of per-wavelength light scattering. The Mie calculations for each scattering angle and wavelength of light are independent of all others and can be performed simultaneously, so my implementation dispatches large groups of these calculations to the GPU to process in parallel. I use a two-dimensional Sobol sequence to sample from (wavelength, scattering angle) space. The 2D Sobol technique ensures that the samples are well distributed, without large gaps or clumps, thereby reducing the number of scattering calculations needed to achieve visually acceptable results. The Sobol sampling calculations are also performed in parallel and use a recently developed technique that precomputes partial results. Overall, this work renders atmospheric glories much faster than previous serial CPU techniques while maintaining high visual fidelity, as measured by both physical and perceptual image error metrics, and it yields equivalent-quality results with far fewer Mie calculations. The results obtained for glories apply fully or in part to related atmospheric phenomena.
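
To make the sampling-and-dispatch structure concrete, here is a minimal Python sketch (not the dissertation code). It assumes SciPy's quasi-Monte Carlo Sobol generator; scattering_intensity is a hypothetical placeholder for the real per-sample Mie calculation, which the actual implementation runs as a GPU kernel rather than a vectorized NumPy batch.

# A minimal sketch of the sampling-and-dispatch pattern, not the dissertation code.
import numpy as np
from scipy.stats import qmc

WAVELENGTH_NM = (380.0, 780.0)   # visible spectrum
ANGLE_DEG = (170.0, 180.0)       # near-backscatter angles where glories appear

def scattering_intensity(wavelength_nm, angle_deg):
    # Hypothetical stand-in: NOT real Mie scattering, just a smooth dummy.
    # Each (wavelength, angle) evaluation is independent of all others.
    return np.cos(np.radians(angle_deg)) ** 2 / wavelength_nm

def sample_glory(n_samples=1024):
    # 2D Sobol sequence: low-discrepancy points that cover the
    # (wavelength, scattering angle) plane without large gaps or clumps.
    u = qmc.Sobol(d=2, scramble=False).random(n_samples)   # (n, 2) in [0, 1)
    wavelengths = WAVELENGTH_NM[0] + u[:, 0] * (WAVELENGTH_NM[1] - WAVELENGTH_NM[0])
    angles = ANGLE_DEG[0] + u[:, 1] * (ANGLE_DEG[1] - ANGLE_DEG[0])
    # Evaluating all samples as one batch mirrors dispatching one large
    # group of independent threads to the GPU.
    return wavelengths, angles, scattering_intensity(wavelengths, angles)

wl, ang, intensity = sample_glory()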

My goal is to produce perceptually accurate images of atmospheric phenomena at real-time rates for use in games, VR, and other interactive applications.

Glory on clouds

Poster presented at SIGGRAPH 2015.


Additional Research

RatCAVE: calibration of a projection virtual reality system

Collaboration between UMBC computer science researchers and Howard Hughes Medical Institute neuroscientists.

We have created a suite of automated tools to calibrate and configure a projection virtual reality system. Test subjects (rats) explore an interactive computer-graphics environment presented on a large curved screen by multiple projectors. The locations and characteristics of the projectors can vary, and the shape of the screen may be complex. We reconstruct the 3D geometry of the screen and the location of each projector using structure-from-motion and structured-light multi-camera computer vision techniques. We determine which projected pixel corresponds to each view direction from the rat's position and store this information in a warp map for each projector. Each projector uses that view direction to look up pixel colors in an animated cubemap. The result is a pre-distorted output image that appears undistorted from the rat's viewpoint when displayed on the screen.
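
The final lookup step can be sketched in Python as a simplified stand-in (not the project's actual code). It assumes calibration has already produced, for each projector, a warp map giving the rat's-eye view direction for every projector pixel, and it samples one frame of the animated cubemap along those directions to build the pre-distorted image. The cubemap face names ('+x', '-x', ...) and face orientations used here are simplifying assumptions.

# Simplified stand-in for the warp-map lookup step, not the project's actual code.
import numpy as np

def sample_cubemap(cubemap, directions):
    # cubemap: dict mapping '+x','-x','+y','-y','+z','-z' -> (h, w, 3) images
    #          (one frame of the animated environment); naming is an assumption.
    # directions: (H, W, 3) unit view directions from the rat's viewpoint.
    h, w, _ = directions.shape
    out = np.zeros((h, w, 3), dtype=np.float32)
    x, y, z = directions[..., 0], directions[..., 1], directions[..., 2]
    absd = np.abs(directions)
    major = np.argmax(absd, axis=-1)          # dominant axis: 0=x, 1=y, 2=z
    minor = [(y, z), (x, z), (x, y)]          # the two in-face components per axis
    for axis in range(3):
        comp = directions[..., axis]
        for sign, prefix in ((1, '+'), (-1, '-')):
            mask = (major == axis) & (sign * comp > 0)
            if not mask.any():
                continue
            face = cubemap[prefix + 'xyz'[axis]]
            fh, fw, _ = face.shape
            # Project the two minor components onto the face and map to [0, 1).
            denom = absd[..., axis][mask]
            u = 0.5 * (minor[axis][0][mask] / denom + 1.0)
            v = 0.5 * (minor[axis][1][mask] / denom + 1.0)
            px = np.clip((u * fw).astype(int), 0, fw - 1)
            py = np.clip((v * fh).astype(int), 0, fh - 1)
            out[mask] = face[py, px]          # nearest-neighbor lookup
    return out

def predistort(warp_map, cubemap_frame):
    # warp_map holds the view direction for each projector pixel; the returned
    # image appears undistorted from the rat's viewpoint once projected onto
    # the curved screen.
    return sample_cubemap(cubemap_frame, warp_map)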

RatCAVE diagram

Poster presented at SIGGRAPH 2016.


Brief Biography

Education

Employment Highlights

I have one US patent, two skydiving world records, and credits on six films.
My Erdős number is 4, and my Bacon number is 3.