sensor.h
/*
    This file is part of Mitsuba, a physically based rendering system.

    Copyright (c) 2007-2014 by Wenzel Jakob and others.

    Mitsuba is free software; you can redistribute it and/or modify
    it under the terms of the GNU General Public License Version 3
    as published by the Free Software Foundation.

    Mitsuba is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
    GNU General Public License for more details.

    You should have received a copy of the GNU General Public License
    along with this program. If not, see <http://www.gnu.org/licenses/>.
*/

#pragma once
#if !defined(__MITSUBA_RENDER_SENSOR_H_)
#define __MITSUBA_RENDER_SENSOR_H_

#include <mitsuba/render/common.h>
#include <mitsuba/render/film.h>
#include <mitsuba/render/emitter.h>

MTS_NAMESPACE_BEGIN

/**
 * \brief Abstract sensor interface
 *
 * This class provides an abstract interface to all sensor plugins in Mitsuba.
 * It exposes functions for evaluating and sampling the response function of the
 * sensor, and it allows querying the probability density of the sampling method.
 *
 * Somewhat curiously, the \ref Sensor class derives from \ref AbstractEmitter.
 * The reason for this is that much like radiance, the spectral response of a
 * sensor can be interpreted as an emitted quantity named \a importance. The
 * \ref Sensor interface thus inherits almost all of the emitter API and only
 * needs to add a few camera-specific methods on top.
 *
 * The concept of interpreting sensor response as an emitted quantity, and
 * the resulting flexibility of being able to dynamically transition between
 * emitter and receiver interpretations of luminaires and sensors, is a key
 * insight that enables the construction of powerful bidirectional rendering
 * techniques. It is the reason why the API to these components may seem
 * somewhat unorthodox.
 *
 * In Mitsuba, a sensor can be as simple as an irradiance meter that performs a
 * single measurement along a specified ray, but it can also represent sensors
 * that are more commonly used in computer graphics, such as a perspective camera
 * based on the thin lens equation.
 *
 * An important difference between a luminaire and a sensor is that the sensor
 * records spectral measurements to a film, and for that reason it needs a
 * mapping between rays and film pixel coordinates. Apart from that, the
 * interfaces are almost identical.
 *
 * Mitsuba assumes that a sensor always has some form of "shutter", which opens
 * for a certain time, during which the exposure takes place. The sensor
 * itself may also undergo motion while the shutter is open, but a more
 * complicated dependence on time is not allowed.
 *
 * \ingroup librender
 */
class MTS_EXPORT_RENDER Sensor : public AbstractEmitter {
public:
    /**
     * \brief This list of flags is used to additionally characterize
     * and classify the response functions of different types of sensors
     *
     * \sa AbstractEmitter::EEmitterType
     */
    enum ESensorFlags {
        /// Sensor response contains a Dirac delta term with respect to time
        EDeltaTime                   = 0x010,

        /// Does the \ref sampleRay() function need an aperture sample?
        ENeedsApertureSample         = 0x020,

        /// Is the sensor a projective camera?
        EProjectiveCamera            = 0x100,

        /// Is the sensor a perspective camera?
        EPerspectiveCamera           = 0x200,

        /// Is the sensor an orthographic camera?
        EOrthographicCamera          = 0x400,

        /// Does the sample given to \ref samplePosition() determine the pixel coordinates?
        EPositionSampleMapsToPixels  = 0x1000,

        /// Does the sample given to \ref sampleDirection() determine the pixel coordinates?
        EDirectionSampleMapsToPixels = 0x2000
    };

    // =============================================================
    //! @{ \name Additional sensor-related sampling functions
    // =============================================================

    /**
     * \brief Importance sample a ray according to the sensor response
     *
     * This function combines all three of the steps of sampling a time,
     * ray position, and direction value. It does not return any auxiliary
     * sampling information and is mainly meant to be used by unidirectional
     * rendering techniques.
     *
     * Note that this function potentially uses a different sampling
     * strategy compared to the sequence of running \ref sampleArea()
     * and \ref sampleDirection(). The reason for this is that it may
     * be possible to switch to a better technique when sampling both
     * position and direction at the same time.
     *
     * \param ray
     *     A ray data structure to be populated with a position
     *     and direction value
     *
     * \param samplePosition
     *     Denotes the desired sample position in fractional pixel
     *     coordinates relative to the crop window of the underlying
     *     film.
     *
     * \param apertureSample
     *     A uniformly distributed 2D vector that is used to sample
     *     a position on the aperture of the sensor if necessary.
     *     (Any value is valid when \ref needsApertureSample() == \c false)
     *
     * \param timeSample
     *     A uniformly distributed 1D sample that is used to sample
     *     the temporal component of the response.
     *     (Any value is valid when \ref needsTimeSample() == \c false)
     *
     * \return
     *     An importance weight associated with the sampled ray.
     *     This accounts for the difference between the sensor response
     *     and the sampling density function.
     *
     * \remark
     *     In the Python API, the signature of this function is
     *     <tt>spectrum, ray = sensor.sampleRay(samplePosition, apertureSample)</tt>
     */
    virtual Spectrum sampleRay(Ray &ray,
        const Point2 &samplePosition,
        const Point2 &apertureSample,
        Float timeSample) const = 0;

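    /* Illustrative usage sketch (not part of the interface): how a simple
     * unidirectional integrator might generate a camera ray via
     * \ref sampleRay(). The variables \c sensor and \c sampler are assumed
     * to be in scope, and \c pixelPos denotes a fractional pixel position
     * within the film's crop window.
     *
     * \code
     * Point2 apertureSample(0.5f);
     * Float timeSample = 0.5f;
     * if (sensor->needsApertureSample())
     *     apertureSample = sampler->next2D();
     * if (sensor->needsTimeSample())
     *     timeSample = sampler->next1D();
     * Ray ray;
     * Spectrum weight = sensor->sampleRay(ray, pixelPos,
     *     apertureSample, timeSample);
     * \endcode
     */
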
    /**
     * \brief Importance sample a ray differential according to the
     * sensor response
     *
     * This function combines all three of the steps of sampling a time,
     * ray position, and direction value. It does not return any auxiliary
     * sampling information and is mainly meant to be used by unidirectional
     * rendering techniques.
     *
     * Note that this function potentially uses a different sampling
     * strategy compared to the sequence of running \ref sampleArea()
     * and \ref sampleDirection(). The reason for this is that it may
     * be possible to switch to a better technique when sampling both
     * position and direction at the same time.
     *
     * The default implementation computes differentials using several
     * internal calls to \ref sampleRay(). Subclasses of the \ref Sensor
     * interface may optionally provide a more efficient approach.
     *
     * \param ray
     *     A ray data structure to be populated with a position
     *     and direction value
     *
     * \param samplePosition
     *     Denotes the desired sample position in fractional pixel
     *     coordinates relative to the crop window of the underlying
     *     film.
     *
     * \param apertureSample
     *     A uniformly distributed 2D vector that is used to sample
     *     a position on the aperture of the sensor if necessary.
     *     (Any value is valid when \ref needsApertureSample() == \c false)
     *
     * \param timeSample
     *     A uniformly distributed 1D sample that is used to sample
     *     the temporal component of the response.
     *     (Any value is valid when \ref needsTimeSample() == \c false)
     *
     * \return
     *     An importance weight associated with the sampled ray.
     *     This accounts for the difference between the sensor response
     *     and the sampling density function.
     *
     * \remark
     *     In the Python API, the signature of this function is
     *     <tt>spectrum, ray = sensor.sampleRayDifferential(samplePosition, apertureSample)</tt>
     */
    virtual Spectrum sampleRayDifferential(RayDifferential &ray,
        const Point2 &samplePosition,
        const Point2 &apertureSample,
        Float timeSample) const;

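    /* Sketch of the finite-offset scheme described above (simplified; the
     * actual default implementation lives in the corresponding .cpp file):
     * sample the center ray as well as rays shifted by one pixel in x and y,
     * and record the shifted rays in the differential fields.
     *
     * \code
     * Spectrum result = sampleRay(ray, samplePosition,
     *     apertureSample, timeSample);
     * Ray rayX, rayY;
     * sampleRay(rayX, samplePosition + Vector2(1, 0),
     *     apertureSample, timeSample);
     * sampleRay(rayY, samplePosition + Vector2(0, 1),
     *     apertureSample, timeSample);
     * ray.rxOrigin = rayX.o; ray.rxDirection = rayX.d;
     * ray.ryOrigin = rayY.o; ray.ryDirection = rayY.d;
     * ray.hasDifferentials = true;
     * \endcode
     */
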
    /// Importance sample the temporal part of the sensor response function
    inline Float sampleTime(Float sample) const {
        return m_shutterOpen + m_shutterOpenTime * sample;
    }

    //! @}
    // =============================================================

    // =============================================================
    //! @{ \name Additional query functions
    // =============================================================

    /**
     * \brief Return the emitted importance for the given surface intersection
     *
     * This function is used when a sensor has been hit by a
     * ray in a particle tracing-style integrator, and it subsequently needs to
     * be queried for the emitted importance along the negative ray direction.
     *
     * It efficiently computes the product of \ref evalPosition()
     * and \ref evalDirection(), though note that it does not include the
     * cosine foreshortening factor of the latter method.
     *
     * This function is provided here as a fast convenience function for
     * unidirectional rendering techniques that support intersecting the
     * sensor. The default implementation throws an exception, which
     * states that the method is not implemented.
     *
     * \param its
     *     An intersection record that specifies the query position
     *
     * \param d
     *     A unit vector, which specifies the query direction
     *
     * \param samplePos
     *     This argument is used to return the 2D sample position
     *     (i.e. the fractional pixel coordinates) associated
     *     with the intersection
     *
     * \return
     *     The emitted importance
     *
     * \remark
     *     In the Python API, the signature of this function is
     *     <tt>spectrum, samplePos = sensor.eval(its, d)</tt>
     */
    virtual Spectrum eval(const Intersection &its, const Vector &d,
        Point2 &samplePos) const;

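    /* Illustrative sketch (assumed names): how a particle tracer might use
     * \ref eval() after one of its rays has intersected the sensor. Here
     * \c its is the intersection record, \c ray the intersected ray, and
     * \c throughput the accumulated path weight; the actual film splatting
     * call is elided.
     *
     * \code
     * Point2 samplePos;
     * Spectrum importance = sensor->eval(its, -ray.d, samplePos);
     * if (!importance.isZero()) {
     *     Spectrum contribution = throughput * importance;
     *     // ... splat 'contribution' at 'samplePos' on the film
     * }
     * \endcode
     */
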
    /**
     * \brief Return the sample position associated with a given
     * position and direction sampling record
     *
     * \param pRec
     *     A position sampling record, which specifies the query position
     *
     * \param dRec
     *     A direction sampling record, which specifies the query direction
     *
     * \param position
     *     This argument is used to return the associated sample position
     *
     * \return \c true if the specified ray is visible to the camera
     *
     * \remark
     *     In the Python API, the signature of this function is
     *     <tt>visible, position = sensor.getSamplePosition(pRec, dRec)</tt>
     */
    virtual bool getSamplePosition(const PositionSamplingRecord &pRec,
        const DirectionSamplingRecord &dRec, Point2 &position) const;

    /**
     * \brief Evaluate the temporal component of the sampling density
     * implemented by the \ref sampleRay() method.
     */
    Float pdfTime(const Ray &ray, EMeasure measure) const;

    /// Return the time value of the shutter opening event
    inline Float getShutterOpen() const { return m_shutterOpen; }

    /// Set the time value of the shutter opening event
    void setShutterOpen(Float time) { m_shutterOpen = time; }

    /// Return the length of time for which the shutter remains open
    inline Float getShutterOpenTime() const { return m_shutterOpenTime; }

    /// Set the length of time for which the shutter remains open
    void setShutterOpenTime(Float time);

    /**
     * \brief Does the method \ref sampleRay() require a uniformly distributed
     * sample for the time-dependent component?
     */
    inline bool needsTimeSample() const { return !(m_type & EDeltaTime); }

    //! @}
    // =============================================================

    // =============================================================
    //! @{ \name Miscellaneous
    // =============================================================

    /**
     * \brief Does the method \ref sampleRay() require a uniformly
     * distributed sample for the aperture component?
     */
    inline bool needsApertureSample() const { return m_type & ENeedsApertureSample; }

    /// Return the \ref Film instance associated with this sensor
    inline Film *getFilm() { return m_film; }

    /// Return the \ref Film instance associated with this sensor (const)
    inline const Film *getFilm() const { return m_film.get(); }

    /// Return the aspect ratio of the sensor and its underlying film
    inline Float getAspect() const { return m_aspect; }

    /**
     * \brief Return the sensor's sample generator
     *
     * This is the \a root sampler, which will later be cloned a
     * number of times to provide each participating worker thread
     * with its own instance (see \ref Scene::getSampler()).
     * Therefore, this sampler should never be used for anything
     * except creating clones.
     */
    inline Sampler *getSampler() { return m_sampler; }

    /**
     * \brief Return the sensor's sampler (const version).
     *
     * This is the \a root sampler, which will later be cloned a
     * number of times to provide each participating worker thread
     * with its own instance (see \ref Scene::getSampler()).
     * Therefore, this sampler should never be used for anything
     * except creating clones.
     */
    inline const Sampler *getSampler() const { return m_sampler.get(); }

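    /* Per-thread usage sketch: following the note above, the root sampler
     * is only ever cloned, and each worker draws samples from its own copy.
     * This assumes \ref Sampler::clone() from the sampler interface.
     *
     * \code
     * ref<Sampler> localSampler = sensor->getSampler()->clone();
     * \endcode
     */
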
    /// Serialize this sensor to a binary data stream
    virtual void serialize(Stream *stream, InstanceManager *manager) const;

    //! @}
    // =============================================================

    // =============================================================
    //! @{ \name ConfigurableObject interface
    // =============================================================
    /// Add a child ConfigurableObject
    virtual void addChild(const std::string &name, ConfigurableObject *child);
    /// Add an unnamed child
    inline void addChild(ConfigurableObject *child) { addChild("", child); }

    /** \brief Configure the object (called \a once after construction
        and addition of all child \ref ConfigurableObject instances). */
    virtual void configure();

    //! @}
    // =============================================================

    MTS_DECLARE_CLASS()
protected:
    /// Construct a new sensor instance
    Sensor(const Properties &props);

    /// Unserialize a sensor instance from a binary data stream
    Sensor(Stream *stream, InstanceManager *manager);

    /// Virtual destructor
    virtual ~Sensor();
protected:
    ref<Film> m_film;
    ref<Sampler> m_sampler;
    Vector2 m_resolution;
    Vector2 m_invResolution;
    Float m_shutterOpen;
    Float m_shutterOpenTime;
    Float m_aspect;
};

/**
 * \brief Projective camera interface
 *
 * This class provides an abstract interface to several types of sensors that
 * are commonly used in computer graphics, such as perspective and orthographic
 * camera models.
 *
 * The interface is meant to be implemented by any kind of sensor whose
 * world-to-clip-space transformation can be expressed using only linear
 * operations on homogeneous coordinates.
 *
 * A useful feature of \ref ProjectiveCamera sensors is that their view can be
 * rendered using the traditional OpenGL pipeline.
 *
 * \ingroup librender
 */
class MTS_EXPORT_RENDER ProjectiveCamera : public Sensor {
public:
    using Sensor::getWorldTransform;

    /// Return the world-to-view (aka "view") transformation at time \c t
    inline const Transform getViewTransform(Float t) const {
        return getWorldTransform()->eval(t).inverse();
    }

    /// Return the view-to-world transformation at time \c t
    inline const Transform getWorldTransform(Float t) const {
        return getWorldTransform()->eval(t);
    }

    /**
     * \brief Overwrite the view-to-world transformation
     * with a static (i.e. non-animated) transformation
     */
    void setWorldTransform(const Transform &trafo);

    /**
     * \brief Overwrite the view-to-world transformation
     * with an animated transformation
     */
    void setWorldTransform(AnimatedTransform *trafo);

    /**
     * \brief Return a projection matrix suitable for rendering the
     * scene using OpenGL
     *
     * For scenes involving a narrow depth of field and antialiasing,
     * it is necessary to average many separately rendered images using
     * different pixel offsets and aperture positions.
     *
     * \param apertureSample
     *     Sample for rendering with defocus blur. This should be a
     *     uniformly distributed random point in [0,1]^2 (or any value
     *     when \ref needsApertureSample() == \c false)
     *
     * \param aaSample
     *     Sample for antialiasing. This should be a uniformly
     *     distributed random point in [0,1]^2.
     */
    virtual Transform getProjectionTransform(const Point2 &apertureSample,
        const Point2 &aaSample) const = 0;

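    /* Illustrative sketch: averaging several OpenGL renderings with
     * jittered aperture and antialiasing samples, as suggested above.
     * \c render() is a hypothetical callback that rasterizes the scene
     * into an accumulation buffer using the given projection matrix;
     * \c camera and \c sampler are assumed to be in scope.
     *
     * \code
     * const int numPasses = 16;
     * for (int i = 0; i < numPasses; ++i) {
     *     Transform proj = camera->getProjectionTransform(
     *         sampler->next2D(), sampler->next2D());
     *     render(proj); // accumulate with weight 1.0f / numPasses
     * }
     * \endcode
     */
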
    /// Serialize this camera to a binary data stream
    virtual void serialize(Stream *stream, InstanceManager *manager) const;

    /// Return the near clip plane distance
    inline Float getNearClip() const { return m_nearClip; }

    /// Set the near clip plane distance
    void setNearClip(Float nearClip);

    /// Return the far clip plane distance
    inline Float getFarClip() const { return m_farClip; }

    /// Set the far clip plane distance
    void setFarClip(Float farClip);

    /// Return the distance to the focal plane
    inline Float getFocusDistance() const { return m_focusDistance; }

    /// Set the distance to the focal plane
    void setFocusDistance(Float focusDistance);

    MTS_DECLARE_CLASS()
protected:
    /// Construct a new camera instance
    ProjectiveCamera(const Properties &props);

    /// Unserialize a camera instance from a binary data stream
    ProjectiveCamera(Stream *stream, InstanceManager *manager);

    /// Virtual destructor
    virtual ~ProjectiveCamera();
protected:
    Float m_nearClip;
    Float m_farClip;
    Float m_focusDistance;
};

/**
 * \brief Perspective camera interface
 *
 * This class provides an abstract interface to perspective camera models,
 * i.e. projective sensors whose world-to-clip-space transformation involves
 * a perspective division.
 *
 * As with \ref ProjectiveCamera, the transformation can be expressed using
 * only linear operations on homogeneous coordinates, hence the view of such
 * a sensor can be rendered using the traditional OpenGL pipeline.
 *
 * \ingroup librender
 */
class MTS_EXPORT_RENDER PerspectiveCamera : public ProjectiveCamera {
public:
    // =============================================================
    //! @{ \name Field of view-related
    // =============================================================

    /// Return the horizontal field of view in degrees
    inline Float getXFov() const { return m_xfov; }

    /// Set the horizontal field of view in degrees
    void setXFov(Float xfov);

    /// Return the vertical field of view in degrees
    Float getYFov() const;

    /// Set the vertical field of view in degrees
    void setYFov(Float yfov);

    /// Return the diagonal field of view in degrees
    Float getDiagonalFov() const;

    /// Set the diagonal field of view in degrees
    void setDiagonalFov(Float dfov);

    //! @}
    // =============================================================

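    /* The horizontal and vertical fields of view are coupled through the
     * aspect ratio of the underlying film. A sketch of the usual pinhole
     * relation (this mirrors the standard conversion, not necessarily the
     * exact implementation in the .cpp file):
     *
     * \code
     * Float yfov = 2 * radToDeg(std::atan(
     *     std::tan(degToRad(xfov) * 0.5f) / getAspect()));
     * \endcode
     */
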
    /** \brief Configure the object (called \a once after construction
        and addition of all child \ref ConfigurableObject instances). */
    virtual void configure();

    /// Serialize this camera to a binary data stream
    virtual void serialize(Stream *stream, InstanceManager *manager) const;

    MTS_DECLARE_CLASS()
protected:
    /// Construct a new perspective camera instance
    PerspectiveCamera(const Properties &props);

    /// Unserialize a perspective camera instance from a binary data stream
    PerspectiveCamera(Stream *stream, InstanceManager *manager);

    /// Virtual destructor
    virtual ~PerspectiveCamera();
protected:
    Float m_xfov;
};

MTS_NAMESPACE_END

#endif /* __MITSUBA_RENDER_SENSOR_H_ */