I got about 3 pages into this and felt like I was following along until the discussion of an incorrect prediction result that:
> In fact, the spurious tendencies are due to an imbalance between the pressure and wind fields resulting in large amplitude high frequency gravity wave oscillations.
Since this paper was published within the last 20 years, I can't imagine what they were referring to by 'gravity waves'.
Gravity waves are waves in a fluid that obtain their restoring force from their buoyancy relative to the surrounding fluid. See [0] for the gory details.
Some examples of gravity waves in the atmosphere: [1] [2]
It's funny that you point this out, since some people in the field of astrophysics are quick to correct others who say "gravity waves" when they really mean "gravitational waves", and sometimes vice versa.
Basically, gravity waves (or g-waves) are a type of perturbation in stratified media where the restoring force is buoyancy. The other scenario is where the restoring force is pressure; those are pressure waves (or p-waves).
Gravity waves are important in not just planetary atmospheres but stellar media as well, particularly in the outer region of stars. A popular candidate for /gravitational/ wave sources is binary neutron star systems, so the two words aren't interchangeable there since they refer to two very different phenomena!
Are you mixing up gravity waves and gravitational waves? Gravity waves are any waves in a medium in which gravity acts as the restoring force: waves in the ocean, waves between layers of different density in the atmosphere, or, AFAIU, the whole atmosphere itself (think of the atmosphere as a big shallow pool, with air sloshing around in the pool instead of a liquid).
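To make the buoyancy-as-restoring-force idea concrete: in a stably stratified layer, a vertically displaced air parcel oscillates at the Brunt–Väisälä frequency, N² = (g/θ)·dθ/dz, where θ is potential temperature. A minimal sketch (the numbers below are just typical tropospheric values, not from the paper):

```python
import math

def brunt_vaisala_frequency(theta, dtheta_dz, g=9.81):
    """Buoyancy (Brunt-Vaisala) frequency N for a stably stratified
    layer: N^2 = (g / theta) * d(theta)/dz, with theta the potential
    temperature (K) and dtheta_dz its vertical gradient (K/m)."""
    n_squared = (g / theta) * dtheta_dz
    if n_squared < 0:
        raise ValueError("negative N^2: layer is convectively unstable")
    return math.sqrt(n_squared)

# Typical troposphere: theta ~ 300 K, dtheta/dz ~ 3 K per km
N = brunt_vaisala_frequency(300.0, 3.0e-3)       # about 0.01 rad/s
period_minutes = 2 * math.pi / N / 60            # about 10 minutes
```

An oscillation period of around ten minutes is why these show up in forecast models as "high frequency" noise relative to the weather signal.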
just curious, do you have references to smartphone sensors being utilized in numerical weather prediction?
i'd been browsing through some WMO reports and other associated NWP lit recently and i'd seen stuff on GPS-RO, but nothing on anyone assimilating "smart" devices.
Sure. I've been working on this problem for about 5 years now, starting with the collection of barometric pressure from Android devices. I'm currently working on this with iPhones at Sunshine [1], where we collect pressure data (along with other metrics), but pressure is the most valuable.
There are researchers who use this data - we have collected about 4 billion atmospheric pressure measurements that we have distributed for academic and government research. The primary researchers are Cliff Mass and his lab at the University of Washington. There are also groups in Canada and the US that are using the data. IBM is now also collecting and using smartphone pressure data through their mobile apps. [2]
Generally speaking, the current approach is to take the live data stream, run it through a quality-control algorithm, and then assimilate it with Kalman filters in the WRF data assimilation package.
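For readers unfamiliar with the assimilation step: at its core, a Kalman update blends a model background value with an observation, weighted by their respective uncertainties. A scalar sketch (the variances below are illustrative, not the actual WRFDA configuration):

```python
def kalman_update(x_bg, var_bg, obs, var_obs):
    """One scalar Kalman update: blend the model background x_bg
    (error variance var_bg) with an observation obs (error variance
    var_obs). The gain weights the more trustworthy source more."""
    gain = var_bg / (var_bg + var_obs)
    x_an = x_bg + gain * (obs - x_bg)   # analysis estimate
    var_an = (1.0 - gain) * var_bg      # analysis uncertainty shrinks
    return x_an, var_an

# Model background surface pressure 1012.0 hPa (variance 4.0) blended
# with a quality-controlled phone reading of 1010.0 hPa (variance 1.0):
x, v = kalman_update(1012.0, 4.0, 1010.0, 1.0)   # -> 1010.4 hPa, 0.8
```

The point of the upstream quality control is that var_obs for a phone barometer is only small after bad readings (elevators, HVAC, pocket pressure) have been filtered out.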
There are some papers published, but it is still early. I will find some links to papers if you'd like to read them. [3]
"It also learns every time you actively report to the community on sky conditions and hazards, translating this information into weather predictions."
does sunshine actively run a numerical weather prediction model like WRF?
it'd be great to assimilate all these extra sensors into the NWP centre's models, but i imagine things like cal/val and WMO agreements to share data might make things difficult for commercial companies?
And yes it would be great to have all these sensor readings available to NOAA, Environment Canada, ECMWF, and everywhere. I have made lots of progress in getting them to talk about it, but it's a long road before any government starts using this data in its own NWP models.
there's some stuff being written this year as well for the side that i work on (space). with all the upcoming sensor gaps, they're looking at alternative ways to cover them.
If you'd like to submit your own weather observations from anywhere on the globe, the National Severe Storms Laboratory has created an app[0] to report them.
if you check out the CGAL library, they have code that solves surface parameterization problems. the least squares conformal mapping is pretty nice for creating uv maps.
Ray tracing is a more generic term that covers rendering techniques which trace rays between a camera and a scene. Back in the day everyone used Whitted-style ray tracers to render shiny metal blobs, because specular reflections are trivially handled with ray tracing. Unfortunately, other physical aspects of light, like diffuse reflections, were not handled by early ray tracing algorithms.
Kajiya introduced the rendering equation, a mathematical formulation of global illumination (it takes diffuse and specular transport into account). It's a multidimensional integral equation. Unidirectional path tracing is an algorithm he introduced to solve the rendering equation: many rays are traced from the camera through the pixels, bouncing randomly through the scene. It's a Monte Carlo integration algorithm for solving the nasty integrals.
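The "Monte Carlo integration" part is the same trick you'd use on any integral: average the integrand at random sample points. A toy 1-D sketch (obviously not rendering code; path tracing applies this over the much higher-dimensional space of light paths, with each random bounce adding a dimension):

```python
import random

def mc_estimate(f, n_samples, seed=0):
    """Monte Carlo estimate of the integral of f over [0, 1]:
    average f at uniformly random points. Error shrinks like
    1/sqrt(n_samples), which is why renders start out noisy."""
    rng = random.Random(seed)
    return sum(f(rng.random()) for _ in range(n_samples)) / n_samples

# The integral of 3x^2 over [0, 1] is exactly 1:
estimate = mc_estimate(lambda x: 3 * x * x, 100_000)
```

The 1/sqrt(n) convergence is why doubling render time only reduces noise by about 30%.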
Photon mapping involves a first pass tracing photons from the light sources into the scene, then a second pass (similar to path tracing) gathering that light back at the camera. It better handles more complex light effects like caustics.
> It better handles more complex light effects like caustics.
I wouldn't say better. I would say it provides an approximation to the rendering equation more quickly than path tracing does. However, photon mapping is a biased algorithm, which means that if you average many independent renderings together, they won't converge on the correct (exact) image. Path tracing methods (bi-directional, Metropolis, etc.) converge on the exact solution, regardless of how noisy each individual rendering is. (However, it may be the case that an unwieldy number of samples is required for tricky caustics, so in practice, path tracing may fail to produce a correct result because of high variance.)
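The bias-vs-consistency distinction is easy to see numerically. Below is a hedged toy analogy, not rendering code: an unbiased estimator (a plain sample mean) averaged across independent runs converges to the truth, while a "biased" one with a fixed offset (standing in for a fixed photon-gather radius) converges to the wrong value no matter how many runs you average. Shrinking the offset as samples grow (the progressive photon mapping idea) restores consistency:

```python
import random

def unbiased_run(rng, n):
    """Sample mean of Uniform(0, 1): unbiased estimate of 0.5."""
    return sum(rng.random() for _ in range(n)) / n

def biased_run(rng, n, blur=0.1):
    """Same mean plus a fixed 'blur' offset, standing in for a fixed
    photon radius: averaging many runs converges to 0.6, not 0.5."""
    return unbiased_run(rng, n) + blur

def consistent_run(rng, n, blur=0.1):
    """Shrink the blur as n grows (cf. progressive photon mapping):
    biased at any finite n, but converges to 0.5 in the limit."""
    return unbiased_run(rng, n) + blur / n ** 0.5

rng = random.Random(1)
runs = 2000
avg_unbiased = sum(unbiased_run(rng, 100) for _ in range(runs)) / runs
avg_biased = sum(biased_run(rng, 100) for _ in range(runs)) / runs
# avg_unbiased approaches 0.5; avg_biased stays near 0.6 forever
```

Each individual unbiased run is still noisy, which is the path tracing trade-off: zero bias, but possibly huge variance for tricky light paths.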
Photon mapping might be biased, but it's extremely easy to make it consistent by using the method outlined by Knaus and Zwicker [1]. Using that method, photon mapping will converge to the right result. Even without progressive photon mapping, you can choose a photon radius that won't cause visible errors.
i built an interactive virtual globe using OpenGL (like nasa worldwind/google earth, but with real time satellite imagery). it's super fun.
when you're dealing with things like constellations of satellites, it's often useful to have a map alternative, since you can only (continuously) see half the earth at once with a globe.
conformality and area are preserved on globes, both physical and virtual (like in google earth). as a kid, i remember thinking africa was huge from the globes that teachers had in classrooms.
in ~2007 or so, i actually tried to buy a physical globe for my desk. i went into a couple office supply stores and the high school age employees had never even heard of a globe. it was the strangest thing trying to explain what a globe was: "it's like a ball with a map on it".
http://www.elsevierscitech.com/emails/physics/climate/the_or...