{"_id":"@polymer/app-media","_rev":"42-2f7178c31fff7ffada8f07ea1dc3dbed","name":"@polymer/app-media","dist-tags":{"next":"3.0.0-pre.20","latest":"3.0.1"},"versions":{"3.0.0-pre.6":{"name":"@polymer/app-media","version":"3.0.0-pre.6","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.6","maintainers":[{"name":"polymer","email":"admin@polymer-project.org"}],"contributors":[{"name":"The Polymer Authors"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"0902b44169313f125bb691935ed600920f958543","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.6.tgz","integrity":"sha512-3gVt91f+dqgys4xX6sGebB4IjNWIcr89anAiXYLMouVHxajaoxI3B1DqoEVAV5GdEO6owtYO8f338pbC93RhXw==","signatures":[{"sig":"MEYCIQDg70RNLXbnUy1WCG8NQDquo12tH8nHhZT66aSg4gzRZwIhAO0nDu8RIWri+fVlOWeAsAFgrRo8BKbfIa++8gT9YMII","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}]},"flat":true,"main":"app-media.html","gitHead":"9d4cda00c4c17ba2f875a5163fa76123ae9f3ae7","_npmUser":{"name":"polymer","email":"admin@polymer-project.org"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"5.6.0","description":"Elements for accessing data from media input 
devices","directories":{},"resolutions":{"samsam":"1.1.3","inherits":"2.0.3","type-detect":"1.0.0","supports-color":"3.1.2"},"_nodeVersion":"9.4.0","dependencies":{"@polymer/polymer":"^3.0.0-pre.6","@polymer/iron-resizable-behavior":"^3.0.0-pre.6"},"devDependencies":{"wct-browser-legacy":"0.0.1-pre.11","@polymer/test-fixture":"^3.0.0-pre.6","@polymer/iron-component-page":"^3.0.0-pre.6","@webcomponents/webcomponentsjs":"^1.0.0"},"_npmOperationalInternal":{"tmp":"tmp/app-media-3.0.0-pre.6.tgz_1516836680419_0.1273038093931973","host":"s3://npm-registry-packages"}},"3.0.0-pre.7":{"name":"@polymer/app-media","version":"3.0.0-pre.7","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.7","maintainers":[{"name":"polymer","email":"admin@polymer-project.org"}],"contributors":[{"name":"The Polymer Authors"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"27de6329ee17cec43f164f171382233a9698471c","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.7.tgz","integrity":"sha512-J6KZZF+/B0WjOKeyaTCT2Oo10fw6b0DROoRCl24z+v531jsH2ArvvmHgyjA0VoxjXqrqyT50ZRAc2DoyPaKoFQ==","signatures":[{"sig":"MEQCIAYbrX+wUu+HARlgnalrsStvCtroYcfMRCzzF6snJ/ghAiAP544XkkSiwGmlKtFXi8pxoULnuUsDiqLXOR+H49Of8A==","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}]},"flat":true,"main":"app-media.html","gitHead":"2442199789fc55370887bea2fc2beedb63c18730","_npmUser":{"name":"polymer","email":"admin@polymer-project.org"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"5.6.0","description":"Elements for accessing data from media input 
devices","directories":{},"resolutions":{"samsam":"1.1.3","inherits":"2.0.3","type-detect":"1.0.0","supports-color":"3.1.2"},"_nodeVersion":"9.4.0","dependencies":{"@polymer/polymer":"^3.0.0-pre.7","@polymer/iron-resizable-behavior":"^3.0.0-pre.7"},"devDependencies":{"wct-browser-legacy":"0.0.1-pre.11","@polymer/test-fixture":"^3.0.0-pre.7","@polymer/iron-component-page":"^3.0.0-pre.7","@webcomponents/webcomponentsjs":"^1.0.0"},"_npmOperationalInternal":{"tmp":"tmp/app-media-3.0.0-pre.7.tgz_1517338108219_0.13449026946909726","host":"s3://npm-registry-packages"}},"3.0.0-pre.8":{"name":"@polymer/app-media","version":"3.0.0-pre.8","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.8","maintainers":[{"name":"polymer","email":"admin@polymer-project.org"}],"contributors":[{"name":"The Polymer Authors"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"7ec56039f0b6a443925c0b2145f3defb0b7e3158","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.8.tgz","fileCount":25,"integrity":"sha512-Xc7h8dgeQSV0xnI8RzuWpz6F/mCBfL0LloeY2aLxZv+wamrc3JgrjeX8uTJqDAXnOAvhLnO5VXwpS/lSfyEI9Q==","signatures":[{"sig":"MEYCIQDE6hhWJI8rANH3U7986c+AS/uZstB5VFdlQCoNjBV3TQIhAJ5N6bXxsYsDiKrsXo29nqjmDv2AYaqB1S+lHQcw0AOr","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":100969},"flat":true,"main":"app-media.html","readme":"## App Media Elements\n\nElements for accessing data from media input devices, such as cameras and\nmicrophones, and visualizing that data for users.\n\n### Introduction\n\nModern web browsers support a set of APIs called\n[Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/). 
These\nAPIs allow you to access inputs such as microphones and cameras, and to a\nlimited extent they also let you record and visualize the data.\n\nBrowsers of yet greater modernity support an API called\n[MediaStream Recording](https://www.w3.org/TR/mediastream-recording/). This\nAPI makes recording and processing the data from these inputs really easy and\nfun.\n\nApp Media is a series of elements that wrap these APIs. The intention is to make\nit easier and more fun to build apps and websites that incorporate video and\naudio recording and visualization, while relying on standardized, highly\nperformant browser APIs.\n\n#### Browser support\n\nThe following emerging platform APIs are used by this collection of elements:\n\n - [Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/)\n - [MediaStream Recording](https://www.w3.org/TR/mediastream-recording/)\n - [Web Audio API](https://www.w3.org/TR/webaudio/)\n - [MediaStream Image Capture](https://w3c.github.io/mediacapture-image/)\n\nSome additional browser support is enabled by the\n[WebRTC polyfill](https://github.com/webrtc/adapter). 
The following\ntable documents browser support for the elements in this collection with the\nWebRTC polyfill in use:\n\n - ✅ Stable native implementation\n - 🚧 Partial fidelity with polyfill\n - 🚫 Not supported at all\n\nElement                   | Chrome | Safari 11 | Firefox | Edge  | IE 11\n--------------------------|--------|-----------|---------|-------|------\n`app-media-video`         |     ✅ |        ✅ |      ✅ |    ✅ |    ✅\n`app-media-audio`         |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-waveform`      |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-devices`       |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-stream`        |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-recorder`      |     ✅ |        🚫 |      ✅ |    🚫 |    🚫\n`app-media-image-capture` |     ✅ |        🚫 |      🚧 |    🚧 |    🚫\n\n### How to use\n\nMany apps that access cameras and microphones may wish to start by discovering\nwhat is possible on the current device. A laptop typically has only one camera,\nbut a phone often has one or two. A microphone is often present on devices these\ndays, but it's good to know for sure.\n\nBefore you begin, make sure that you are loading the\n[WebRTC polyfill](https://github.com/webrtc/adapter) where appropriate so that\nthe most up-to-date versions of the necessary APIs are usable in all of your\ntarget browsers.\n\n### `<app-media-devices>`\n\n`app-media` offers the `app-media-devices` element to assist in looking up the\navailable cameras, microphones and other inputs on the current device. You can\nconfigure it with a string that can be matched against the kind of device you\nwish to look up, and the element will do the rest. 
Here is an example that\nbinds an array of all available microphone-like devices to a property called\n`microphones`:\n\n```html\n<app-media-devices kind=\"audioinput\" devices=\"{{microphones}}\">\n</app-media-devices>\n```\n\nIn the example, the available devices are filtered to those that have the string\n`'audioinput'` in their `kind` field. It is often convenient to refer to a\nsingle selected device. This can be done using the `selected-device` property,\nwhich points to a single device in the list at a time:\n\n```html\n<app-media-devices kind=\"audioinput\" selected-device=\"{{microphone}}\">\n</app-media-devices>\n```\n\n### `<app-media-stream>`\n\nOnce you have found a device that you like, you'll need to access a\n`MediaStream` of the input from the device. The `app-media-stream` makes it\neasy to convert a device reference to a `MediaStream`:\n\n```html\n<app-media-stream audio-device=\"[[microphone]]\" stream=\"{{microphoneStream}}\">\n</app-media-stream>\n```\n\nHowever, sometimes you don't know which device you want to use. The Media\nCapture and Streams API allows users to\n[specify constraints](https://w3c.github.io/mediacapture-main/#constrainable-properties)\nrelated to the input device. For example, if you wish to access a camera stream,\nand you would prefer to get the back-facing camera if available, you could do\nsomething like this:\n\n```html\n<app-media-stream\n    video-constraints='{\"facingMode\":\"environment\"}'\n    stream=\"{{backFacingCameraStream}}\">\n</app-media-stream>\n```\n\nYou can use `app-media-stream` to record a device screen.\n\nScreen sharing in Chrome and Firefox has some differences. 
See\n[this](https://www.webrtc-experiment.com/Pluginfree-Screen-Sharing/#why-screen-fails)\npage for more info.\n\nTo capture the screen in Chrome, use the `{\"mandatory\": {\"chromeMediaSource\": \"screen\"}}`\nvideo constraint:\n\n```html\n<app-media-stream\n    video-constraints='{\"mandatory\": {\"chromeMediaSource\": \"screen\"}}'\n    stream=\"{{stream}}\"\n    active>\n</app-media-stream>\n```\n\nNOTE: As of April 23rd, 2017, screen capturing in Chrome is available only on\nAndroid and requires enabling the `chrome://flags#enable-usermedia-screen-capturing` flag.\n\nTo capture the screen in Firefox, use the `{\"mediaSource\": \"screen\"}` video constraint:\n\n```html\n<app-media-stream\n    video-constraints='{\"mediaSource\": \"screen\"}'\n    stream=\"{{stream}}\"\n    active>\n</app-media-stream>\n```\n\nYou can also use `{\"mediaSource\": \"window\"}` to capture only the application window\nand `{\"mediaSource\": \"application\"}` to capture all application windows,\nnot the whole screen.\n\nNOTE: Firefox (before version 52) requires setting `media.getusermedia.screensharing.enabled`\nto `true` and adding the web app domain to `media.getusermedia.screensharing.allowed_domains`\nin `about:config`.\n\nIt's easy to create a stream that contains both audio and video tracks as well.\nAny combination of devices and constraints can be used when configuring:\n\n```html\n<app-media-stream\n    audio-device=\"[[microphone]]\"\n    video-constraints='{\"facingMode\":\"environment\"}'\n    stream=\"{{cameraAndMicrophoneStream}}\">\n</app-media-stream>\n```\n\nNOTE: Chrome doesn't support combining screen capture video tracks with audio tracks.\n\n### `<app-media-video>`\n\nSuppose you are planning to build an awesome camera app. At some point, you will\nneed to convert your `MediaStream` instance into video that the user can see, so\nthat she knows what is being recorded. Conveniently, you don't need a special\nelement to make this work. 
You can actually just use a basic `<video>` element:\n\n```html\n<video src-object=\"[[backFacingCameraStream]]\" autoplay></video>\n```\n\nOnce the `backFacingCameraStream` is available, the `<video>` element will\ndisplay video from the camera. But, without further intervention, the video will\nchange its size to be the pixel dimensions of the incoming video feed. If you\nare building a camera app, you may want a video that scales predictably inside\nof its container. An easy way to get this is to use `<app-media-video>`:\n\n```html\n<app-media-video source=\"[[backFacingCameraStream]]\" autoplay>\n</app-media-video>\n```\n\nBy default, `<app-media-video>` will automatically scale the video so that it\nis \"full bleed\" relative to the dimensions of the `<app-media-video>` element.\nIt can also be configured to scale the video so that it is contained instead of\ncropped by the boundary of `<app-media-video>`:\n\n```html\n<app-media-video source=\"[[backFacingCameraStream]]\" autoplay contain>\n</app-media-video>\n```\n\nNote that when using a combined stream of camera and microphone data, you may\nwish to mute the video in order to avoid creating a feedback loop.\n\n### `<app-media-recorder>`\n\nEventually you will want to record actual video and audio from the\n`MediaStream`. 
This is where the MediaStream Recording API comes in, and there\nis an element to make it nice and declarative called `<app-media-recorder>`.\nIn order to use it, configure the element with an optional duration and bind the\nstream to it:\n\n```html\n<app-media-recorder\n    id=\"recorder\"\n    stream=\"[[cameraAndMicrophoneStream]]\"\n    data=\"{{recordedVideo}}\"\n    duration=\"3000\">\n</app-media-recorder>\n```\n\nWhen you are ready to make a recording, call the `start` method on the element:\n\n```html\n<script>\nPolymer({\n  is: 'x-camera',\n\n  // ...\n\n  createRecording: function() {\n    this.$.recorder.start();\n  }\n\n  // ....\n});\n</script>\n```\n\nThe `<app-media-recorder>` will start recording from the configured stream and\nautomatically stop after the configured duration. While the recording is taking\nplace, the element will dispatch `app-media-recorder-chunk` events that contain\nindividual data chunks as provided by `MediaRecorder` in the `dataavailable`\nevent. Once the recording is available, it will assign it to the `data` property\n(this will also update the bound `recordedVideo` property in the example above).\n\nIf you don't configure a `duration`, then the recording will continue until you\ncall the `stop` method on the recorder instance.\n\n### `<app-media-image-capture>`\n\nAn emerging standard defines the\n[Image Capture API](https://w3c.github.io/mediacapture-image/), which allows for\nmore fine-grained control of camera settings such as color temperature, white\nbalance, focus and flash. 
It also allows for direct JPEG capture of the image\nthat appears in a given media device.\n\nThe `<app-media-image-capture>` element offers a declarative strategy for\nconfiguring an `ImageCapture` instance and accessing the photos it takes:\n\n```html\n<app-media-image-capture\n    id=\"imageCapture\"\n    stream=\"[[videoStream]]\"\n    focus-mode=\"single-shot\"\n    red-eye-reduction\n    last-photo=\"{{photo}}\">\n</app-media-image-capture>\n```\n\nWhen you are ready to capture a photo, call the `takePhoto` method:\n\n```html\n<script>\nPolymer({\n  is: 'x-camera',\n\n  // ...\n\n  takePhoto: function() {\n    // NOTE: This method also returns a promise that resolves with the photo.\n    this.$.imageCapture.takePhoto();\n  }\n\n  // ....\n});\n</script>\n```\n\n### `<app-media-audio>`\n\nIf you are building a voice memo app, you may wish to access an audio analyzer\nso that you can visualize microphone input in real time. This can be done with\nthe `<app-media-audio>` element:\n\n```html\n<app-media-audio\n    stream=\"[[microphoneStream]]\"\n    analyser=\"{{microphoneAnalyser}}\">\n</app-media-audio>\n```\n\nWhen the `microphoneStream` becomes available, the `microphoneAnalyser` property\nwill be assigned an instance of a Web Audio\n[`AnalyserNode`](https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode)\nthat corresponds to the audio input from that stream. Any stream with at least\none audio track can be used as an input for `<app-media-audio>`.\n\n### `<app-media-waveform>`\n\nThere are many kinds of visualization that might be useful for demonstrating to\nyour users that there is a hot mic on their devices. `<app-media-waveform>` is\na basic SVG visualization that can suit a wide range of visualization needs. 
It\nis very easy to use if you have an `AnalyzerNode` instance:\n\n```html\n<app-media-waveform analyser=\"[[microphoneAnalyser]]\">\n</app-media-waveform>\n```\n\nThe analyzer is minimal, but its foreground and background can be themed to\nachieve some level of customized look and feel:\n\n```html\n<style>\n  :host {\n    --app-media-waveform-background-color: red;\n    --app-media-waveform-foreground-color: lightblue;\n  }\n</style>\n```\n","gitHead":"15b35d0c813df2eacae34d8ec3f15f8ea20d7438","_npmUser":{"name":"polymer","email":"admin@polymer-project.org"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"5.5.1","description":"Elements for accessing data from media input devices","directories":{},"resolutions":{"samsam":"1.1.3","inherits":"2.0.3","type-detect":"1.0.0","supports-color":"3.1.2"},"_nodeVersion":"9.2.0","dependencies":{"@polymer/polymer":"^3.0.0-pre.7","@polymer/iron-resizable-behavior":"^3.0.0-pre.7"},"_hasShrinkwrap":false,"readmeFilename":"README.md","devDependencies":{"wct-browser-legacy":"0.0.1-pre.11","@polymer/test-fixture":"^3.0.0-pre.7","@polymer/iron-component-page":"^3.0.0-pre.7","@webcomponents/webcomponentsjs":"^1.0.0"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0-pre.8_1518029783940_0.16689586889772534","host":"s3://npm-registry-packages"}},"3.0.0-pre.10":{"name":"@polymer/app-media","version":"3.0.0-pre.10","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.10","maintainers":[{"name":"polymer","email":"admin@polymer-project.org"}],"contributors":[{"name":"The Polymer 
Authors"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"ff7b0d9ae0d0f211a4634bc3a0ccddc7d418b144","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.10.tgz","fileCount":17,"integrity":"sha512-Jq0+m1nB/6FGTd136GN5f76EnGvexE/eZm1Q7qkdEzrPAEdWrDJzKo4iYTTfYRFyw1qGbo0boKso5Uax3g/AKQ==","signatures":[{"sig":"MEYCIQClf5iBkJ2p/eP9MIJFVxsghRqce7xvFU8KLg6iUEy73gIhAK9OhkWM+Nl65zzi/Lwvw6x0Ihw/PWaVtXH7WZC1EhjX","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":70511},"flat":true,"main":"app-media.html","readme":"## App Media Elements\n\nElements for accessing data from media input devices, such as cameras and\nmicrophones, and visualizing that data for users.\n\n### Introduction\n\nModern web browsers support a set of APIs called\n[Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/). These\nAPIs allow you to access inputs such as microphones and cameras, and to a\nlimited extent they also let you record and visualize the data.\n\nBrowsers of yet greater modernity support an API called\n[MediaStream Recording](https://www.w3.org/TR/mediastream-recording/). This\nAPI makes recording and processing the data from these inputs really easy and\nfun.\n\nApp Media is a series of elements that wrap these APIs. 
The intention is to make\nit easier and more fun to build apps and websites that incorporate video and\naudio recording and visualization, while relying on standardized, highly\nperformant browser APIs.\n\n#### Browser support\n\nThe following emerging platform APIs are used by this collection of elements:\n\n - [Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/)\n - [MediaStream Recording](https://www.w3.org/TR/mediastream-recording/)\n - [Web Audio API](https://www.w3.org/TR/webaudio/)\n - [MediaStream Image Capture](https://w3c.github.io/mediacapture-image/)\n\nSome additional browser support is enabled by the\n[WebRTC polyfill](https://github.com/webrtc/adapter). The following\ntable documents browser support for the elements in this collection with the\nWebRTC polyfill in use\n\n - ✅ Stable native implementation\n - 🚧 Partial fidelity with polyfill\n - 🚫 Not supported at all\n\nElement                   | Chrome | Safari 11 | Firefox | Edge  | IE 11\n--------------------------|--------|-----------|---------|-------|------\n`app-media-video`         |     ✅ |        ✅ |      ✅ |    ✅ |    ✅\n`app-media-audio`         |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-waveform`      |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-devices`       |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-stream`        |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-recorder`      |     ✅ |        🚫 |      ✅ |    🚫 |    🚫\n`app-media-image-capture` |     ✅ |        🚫 |      🚧 |    🚧 |    🚫\n\n### How to use\n\nMany apps that access cameras and microphones may wish to start by discovering\nwhat is possible on the current device. A laptop typically has only one camera,\nbut a phone often has one or two. 
A microphone is often present on devices these\ndays, but it's good to know for sure.\n\nBefore you begin, make sure that you are loading the\n[WebRTC polyfill](https://github.com/webrtc/adapter) where appropriate so that\nthe most up to date versions of the necessary APIs are usable in all of your\ntarget browsers.\n\n### `<app-media-devices>`\n\n`app-media` offers the `app-media-devices` element to assist in looking up the\navailable cameras, microphones and other inputs on the current device. You can\nconfigure it with a string that can be matched against the kind of device you\nwish to look up, and the element will do the rest. Here is an example that\nbinds an array of all available microphone-like devices to a property called\n`microphones`:\n\n```html\n<app-media-devices kind=\"audioinput\" devices=\"{{microphones}}\">\n</app-media-devices>\n```\n\nIn the example, the available devices are filtered to those that have the string\n`'audioinput'` in their `kind` field. It is often convenient to refer to a\nsingle selected device. This can be done using the `selected-device` property,\nwhich points to a single device in the list at a time:\n\n```html\n<app-media-devices kind=\"audioinput\" selected-device=\"{{microphone}}\">\n</app-media-devices>\n```\n\n### `<app-media-stream>`\n\nOnce you have found a device that you like, you'll need to access a\n`MediaStream` of the input from the device. The `app-media-stream` makes it\neasy to convert a device reference to a `MediaStream`:\n\n```html\n<app-media-stream audio-device=\"[[microphone]]\" stream=\"{{microphoneStream}}\">\n</app-media-stream>\n```\n\nHowever, sometimes you don't know which device you want to use. The Media\nCapture and Streams API allows users to\n[specify constraints](https://w3c.github.io/mediacapture-main/#constrainable-properties)\nrelated to the input device. 
For example, if you wish to access a camera stream,\nand you would prefer to get the back-facing camera if available, you could do\nsomething like this:\n\n```html\n<app-media-stream\n    video-constraints='{\"facingMode\":\"environment\"}'\n    stream=\"{{backFacingCameraStream}}\">\n</app-media-stream>\n```\n\nYou can use `app-media-stream` to record a device screen.\n\nScreen sharing in Chrome and Firefox has some differences. See\n[this](https://www.webrtc-experiment.com/Pluginfree-Screen-Sharing/#why-screen-fails)\npage for more info.\n\nTo capture the screen in Chrome, use the `{\"mandatory\": {\"chromeMediaSource\": \"screen\"}}`\nvideo constraint:\n\n```html\n<app-media-stream\n    video-constraints='{\"mandatory\": {\"chromeMediaSource\": \"screen\"}}'\n    stream=\"{{stream}}\"\n    active>\n</app-media-stream>\n```\n\nNOTE: As of April 23rd, 2017, screen capturing in Chrome is available only on\nAndroid and requires enabling the `chrome://flags#enable-usermedia-screen-capturing` flag.\n\nTo capture the screen in Firefox, use the `{\"mediaSource\": \"screen\"}` video constraint:\n\n```html\n<app-media-stream\n    video-constraints='{\"mediaSource\": \"screen\"}'\n    stream=\"{{stream}}\"\n    active>\n</app-media-stream>\n```\n\nYou can also use `{\"mediaSource\": \"window\"}` to capture only the application window\nand `{\"mediaSource\": \"application\"}` to capture all application windows,\nnot the whole screen.\n\nNOTE: Firefox (before version 52) requires setting `media.getusermedia.screensharing.enabled`\nto `true` and adding the web app domain to `media.getusermedia.screensharing.allowed_domains`\nin `about:config`.\n\nIt's easy to create a stream that contains both audio and video tracks as well.\nAny combination of devices and constraints can be used when configuring:\n\n```html\n<app-media-stream\n    audio-device=\"[[microphone]]\"\n    video-constraints='{\"facingMode\":\"environment\"}'\n    
stream=\"{{cameraAndMicrophoneStream}}\">\n</app-media-stream>\n```\n\nNOTE: Chrome doesn't support combining screen capture video tracks with audio tracks.\n\n### `<app-media-video>`\n\nSuppose you are planning to build an awesome camera app. At some point, you will\nneed to convert your `MediaStream` instance into video that the user can see, so\nthat she knows what is being recorded. Conveniently, you don't need a special\nelement to make this work. You can actually just use a basic `<video>` element:\n\n```html\n<video src-object=\"[[backFacingCameraStream]]\" autoplay></video>\n```\n\nOnce the `backFacingCameraStream` is available, the `<video>` element will\ndisplay video from the camera. But, without further intervention, the video will\nchange its size to be the pixel dimensions of the incoming video feed. If you\nare building a camera app, you may want a video that scales predictably inside\nof its container. An easy way to get this is to use `<app-media-video>`:\n\n```html\n<app-media-video source=\"[[backFacingCameraStream]]\" autoplay>\n</app-media-video>\n```\n\nBy default, `<app-media-video>` will automatically scale the video so that it\nis \"full bleed\" relative to the dimensions of the `<app-media-video>` element.\nIt can also be configured to scale the video so that it is contained instead of\ncropped by the boundary of `<app-media-video>`:\n\n```html\n<app-media-video source=\"[[backFacingCameraStream]]\" autoplay contain>\n</app-media-video>\n```\n\nNote that when using a combined stream of camera and microphone data, you may\nwish to mute the video in order to avoid creating a feedback loop.\n\n### `<app-media-recorder>`\n\nEventually you will want to record actual video and audio from the\n`MediaStream`. 
This is where the MediaStream Recording API comes in, and there\nis an element to make it nice and declarative called `<app-media-recorder>`.\nIn order to use it, configure the element with an optional duration and bind the\nstream to it:\n\n```html\n<app-media-recorder\n    id=\"recorder\"\n    stream=\"[[cameraAndMicrophoneStream]]\"\n    data=\"{{recordedVideo}}\"\n    duration=\"3000\">\n</app-media-recorder>\n```\n\nWhen you are ready to make a recording, call the `start` method on the element:\n\n```html\n<script>\nPolymer({\n  is: 'x-camera',\n\n  // ...\n\n  createRecording: function() {\n    this.$.recorder.start();\n  }\n\n  // ....\n});\n</script>\n```\n\nThe `<app-media-recorder>` will start recording from the configured stream and\nautomatically stop after the configured duration. While the recording is taking\nplace, the element will dispatch `app-media-recorder-chunk` events that contain\nindividual data chunks as provided by `MediaRecorder` in the `dataavailable`\nevent. Once the recording is available, it will assign it to the `data` property\n(this will also update the bound `recordedVideo` property in the example above).\n\nIf you don't configure a `duration`, then the recording will continue until you\ncall the `stop` method on the recorder instance.\n\n### `<app-media-image-capture>`\n\nAn emerging standard defines the\n[Image Capture API](https://w3c.github.io/mediacapture-image/), which allows for\nmore fine-grained control of camera settings such as color temperature, white\nbalance, focus and flash. 
It also allows for direct JPEG capture of the image\nthat appears in a given media device.\n\nThe `<app-media-image-capture>` element offers a declarative strategy for\nconfiguring an `ImageCapture` instance and accessing the photos it takes:\n\n```html\n<app-media-image-capture\n    id=\"imageCapture\"\n    stream=\"[[videoStream]]\"\n    focus-mode=\"single-shot\"\n    red-eye-reduction\n    last-photo=\"{{photo}}\">\n</app-media-image-capture>\n```\n\nWhen you are ready to capture a photo, call the `takePhoto` method:\n\n```html\n<script>\nPolymer({\n  is: 'x-camera',\n\n  // ...\n\n  takePhoto: function() {\n    // NOTE: This method also returns a promise that resolves with the photo.\n    this.$.imageCapture.takePhoto();\n  }\n\n  // ....\n});\n</script>\n```\n\n### `<app-media-audio>`\n\nIf you are building a voice memo app, you may wish to access an audio analyzer\nso that you can visualize microphone input in real time. This can be done with\nthe `<app-media-audio>` element:\n\n```html\n<app-media-audio\n    stream=\"[[microphoneStream]]\"\n    analyser=\"{{microphoneAnalyser}}\">\n</app-media-audio>\n```\n\nWhen the `microphoneStream` becomes available, the `microphoneAnalyser` property\nwill be assigned an instance of a Web Audio\n[`AnalyserNode`](https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode)\nthat corresponds to the audio input from that stream. Any stream with at least\none audio track can be used as an input for `<app-media-audio>`.\n\n### `<app-media-waveform>`\n\nThere are many kinds of visualization that might be useful for demonstrating to\nyour users that there is a hot mic on their devices. `<app-media-waveform>` is\na basic SVG visualization that can suit a wide range of visualization needs. 
It\nis very easy to use if you have an `AnalyzerNode` instance:\n\n```html\n<app-media-waveform analyser=\"[[microphoneAnalyser]]\">\n</app-media-waveform>\n```\n\nThe analyzer is minimal, but its foreground and background can be themed to\nachieve some level of customized look and feel:\n\n```html\n<style>\n  :host {\n    --app-media-waveform-background-color: red;\n    --app-media-waveform-foreground-color: lightblue;\n  }\n</style>\n```\n","gitHead":"e0f08e159427abf7a5047fd57742d99078f79dc9","_npmUser":{"name":"polymer","email":"admin@polymer-project.org"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"5.4.1","description":"Elements for accessing data from media input devices","directories":{},"resolutions":{"samsam":"1.1.3","inherits":"2.0.3","type-detect":"1.0.0","supports-color":"3.1.2"},"_nodeVersion":"8.2.1","dependencies":{"@polymer/polymer":"^3.0.0-pre.10","@polymer/iron-resizable-behavior":"^3.0.0-pre.10"},"_hasShrinkwrap":false,"readmeFilename":"README.md","devDependencies":{"wct-browser-legacy":"0.0.1-pre.11","@polymer/test-fixture":"^3.0.0-pre.10","@polymer/iron-component-page":"^3.0.0-pre.10","@webcomponents/webcomponentsjs":"^1.0.0"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0-pre.10_1519341312316_0.7964692325447722","host":"s3://npm-registry-packages"}},"3.0.0-pre.11":{"name":"@polymer/app-media","version":"3.0.0-pre.11","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"author":{"name":"The Polymer 
Authors"},"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.11","maintainers":[{"name":"polymer","email":"admin@polymer-project.org"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"3dd4fda5d74ee9af19568336962495f8f1d4f736","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.11.tgz","fileCount":24,"integrity":"sha512-s0OQdsYhWiz6nlN1tTJy4Knh1KrRxOvrvup+T36TFzhA+vfyxl7tNep40yAsucOemdOSGQL/CjBk0kXXDdL3BA==","signatures":[{"sig":"MEQCIB3fWU+1Tl+DPvr6EYbqnyl7Gb8egcob2rozUGpJFc6bAiB6faNQ0lVCIpZijxn+Z6K4qVr51sNgVCudx1JdA3QujA==","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":99886},"main":"app-media.js","readme":"## App Media Elements\n\nElements for accessing data from media input devices, such as cameras and\nmicrophones, and visualizing that data for users.\n\n### Introduction\n\nModern web browsers support a set of APIs called\n[Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/). These\nAPIs allow you to access inputs such as microphones and cameras, and to a\nlimited extent they also let you record and visualize the data.\n\nBrowsers of yet greater modernity support an API called\n[MediaStream Recording](https://www.w3.org/TR/mediastream-recording/). This\nAPI makes recording and processing the data from these inputs really easy and\nfun.\n\nApp Media is a series of elements that wrap these APIs. 
The intention is to make\nit easier and more fun to build apps and websites that incorporate video and\naudio recording and visualization, while relying on standardized, highly\nperformant browser APIs.\n\n#### Browser support\n\nThe following emerging platform APIs are used by this collection of elements:\n\n - [Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/)\n - [MediaStream Recording](https://www.w3.org/TR/mediastream-recording/)\n - [Web Audio API](https://www.w3.org/TR/webaudio/)\n - [MediaStream Image Capture](https://w3c.github.io/mediacapture-image/)\n\nSome additional browser support is enabled by the\n[WebRTC polyfill](https://github.com/webrtc/adapter). The following\ntable documents browser support for the elements in this collection with the\nWebRTC polyfill in use:\n\n - ✅ Stable native implementation\n - 🚧 Partial fidelity with polyfill\n - 🚫 Not supported at all\n\nElement                   | Chrome | Safari 11 | Firefox | Edge  | IE 11\n--------------------------|--------|-----------|---------|-------|------\n`app-media-video`         |     ✅ |        ✅ |      ✅ |    ✅ |    ✅\n`app-media-audio`         |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-waveform`      |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-devices`       |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-stream`        |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-recorder`      |     ✅ |        🚫 |      ✅ |    🚫 |    🚫\n`app-media-image-capture` |     ✅ |        🚫 |      🚧 |    🚧 |    🚫\n\n### How to use\n\nMany apps that access cameras and microphones may wish to start by discovering\nwhat is possible on the current device. A laptop typically has only one camera,\nbut a phone often has one or two. 
A microphone is often present on devices these\ndays, but it's good to know for sure.\n\nBefore you begin, make sure that you are loading the\n[WebRTC polyfill](https://github.com/webrtc/adapter) where appropriate so that\nthe most up-to-date versions of the necessary APIs are usable in all of your\ntarget browsers.\n\n### `<app-media-devices>`\n\n`app-media` offers the `app-media-devices` element to assist in looking up the\navailable cameras, microphones and other inputs on the current device. You can\nconfigure it with a string that can be matched against the kind of device you\nwish to look up, and the element will do the rest. Here is an example that\nbinds an array of all available microphone-like devices to a property called\n`microphones`:\n\n```html\n<app-media-devices kind=\"audioinput\" devices=\"{{microphones}}\">\n</app-media-devices>\n```\n\nIn the example, the available devices are filtered to those that have the string\n`'audioinput'` in their `kind` field. It is often convenient to refer to a\nsingle selected device. This can be done using the `selected-device` property,\nwhich points to a single device in the list at a time:\n\n```html\n<app-media-devices kind=\"audioinput\" selected-device=\"{{microphone}}\">\n</app-media-devices>\n```\n\n### `<app-media-stream>`\n\nOnce you have found a device that you like, you'll need to access a\n`MediaStream` of the input from the device. The `app-media-stream` element makes it\neasy to convert a device reference to a `MediaStream`:\n\n```html\n<app-media-stream audio-device=\"[[microphone]]\" stream=\"{{microphoneStream}}\">\n</app-media-stream>\n```\n\nHowever, sometimes you don't know which device you want to use. The Media\nCapture and Streams API allows users to\n[specify constraints](https://w3c.github.io/mediacapture-main/#constrainable-properties)\nrelated to the input device. 
For example, if you wish to access a camera stream,\nand you would prefer to get the back-facing camera if available, you could do\nsomething like this:\n\n```html\n<app-media-stream\n    video-constraints='{\"facingMode\":\"environment\"}'\n    stream=\"{{backFacingCameraStream}}\">\n</app-media-stream>\n```\n\nYou can also use `app-media-stream` to capture the device screen.\n\nScreen sharing in Chrome and Firefox has some differences. See\n[this](https://www.webrtc-experiment.com/Pluginfree-Screen-Sharing/#why-screen-fails)\npage for more info.\n\nTo capture the screen in Chrome, use the `{\"mandatory\": {\"chromeMediaSource\": \"screen\"}}`\nvideo constraint:\n\n```html\n<app-media-stream\n    video-constraints='{\"mandatory\": {\"chromeMediaSource\": \"screen\"}}'\n    stream=\"{{stream}}\"\n    active>\n</app-media-stream>\n```\n\nNOTE: As of this writing (April 23rd, 2017), screen capturing in Chrome is available only on\nAndroid and requires enabling the `chrome://flags#enable-usermedia-screen-capturing` flag.\n\nTo capture the screen in Firefox, use the `{\"mediaSource\": \"screen\"}` video constraint:\n\n```html\n<app-media-stream\n    video-constraints='{\"mediaSource\": \"screen\"}'\n    stream=\"{{stream}}\"\n    active>\n</app-media-stream>\n```\n\nYou can also use `{\"mediaSource\": \"window\"}` to capture only the application window\nand `{\"mediaSource\": \"application\"}` to capture all application windows,\nnot the whole screen.\n\nNOTE: Firefox (before version 52) requires setting `media.getusermedia.screensharing.enabled`\nto `true` and adding the web app domain to `media.getusermedia.screensharing.allowed_domains`\nin `about:config`.\n\nIt's easy to create a stream that contains both audio and video tracks as well.\nAny combination of devices and constraints can be used when configuring:\n\n```html\n<app-media-stream\n    audio-device=\"[[microphone]]\"\n    video-constraints='{\"facingMode\":\"environment\"}'\n    
stream=\"{{cameraAndMicrophoneStream}}\">\n</app-media-stream>\n```\n\nNOTE: Chrome doesn't support combining screen capture video tracks with audio tracks.\n\n### `<app-media-video>`\n\nSuppose you are planning to build an awesome camera app. At some point, you will\nneed to convert your `MediaStream` instance into video that the user can see, so\nthat she knows what is being recorded. Conveniently, you don't need a special\nelement to make this work. You can actually just use a basic `<video>` element:\n\n```html\n<video src-object=\"[[backFacingCameraStream]]\" autoplay></video>\n```\n\nOnce the `backFacingCameraStream` is available, the `<video>` element will\ndisplay video from the camera. But, without further intervention, the video will\nchange its size to be the pixel dimensions of the incoming video feed. If you\nare building a camera app, you may want a video that scales predictably inside\nof its container. An easy way to get this is to use `<app-media-video>`:\n\n```html\n<app-media-video source=\"[[backFacingCameraStream]]\" autoplay>\n</app-media-video>\n```\n\nBy default, `<app-media-video>` will automatically scale the video so that it\nis \"full bleed\" relative to the dimensions of the `<app-media-video>` element.\nIt can also be configured to scale the video so that it is contained instead of\ncropped by the boundary of `<app-media-video>`:\n\n```html\n<app-media-video source=\"[[backFacingCameraStream]]\" autoplay contain>\n</app-media-video>\n```\n\nNote that when using a combined stream of camera and microphone data, you may\nwish to mute the video in order to avoid creating a feedback loop.\n\n### `<app-media-recorder>`\n\nEventually you will want to record actual video and audio from the\n`MediaStream`. 
This is where the MediaStream Recording API comes in, and there\nis an element, `<app-media-recorder>`, that makes it nice and declarative.\nIn order to use it, configure the element with an optional duration and bind the\nstream to it:\n\n```html\n<app-media-recorder\n    id=\"recorder\"\n    stream=\"[[cameraAndMicrophoneStream]]\"\n    data=\"{{recordedVideo}}\"\n    duration=\"3000\">\n</app-media-recorder>\n```\n\nWhen you are ready to make a recording, call the `start` method on the element:\n\n```html\n<script>\nPolymer({\n  is: 'x-camera',\n\n  // ...\n\n  createRecording: function() {\n    this.$.recorder.start();\n  }\n\n  // ...\n});\n</script>\n```\n\nThe `<app-media-recorder>` will start recording from the configured stream and\nautomatically stop after the configured duration. While the recording is taking\nplace, the element will dispatch `app-media-recorder-chunk` events that contain\nindividual data chunks as provided by `MediaRecorder` in the `dataavailable`\nevent. Once the recording is available, the element will assign it to the `data` property\n(this will also update the bound `recordedVideo` property in the example above).\n\nIf you don't configure a `duration`, then the recording will continue until you\ncall the `stop` method on the recorder instance.\n\n### `<app-media-image-capture>`\n\nAn emerging standard defines the\n[Image Capture API](https://w3c.github.io/mediacapture-image/), which allows for\nmore fine-grained control of camera settings such as color temperature, white\nbalance, focus and flash. 
It also allows for direct JPEG capture of the image\nthat appears in a given media device.\n\nThe `<app-media-image-capture>` element offers a declarative strategy for\nconfiguring an `ImageCapture` instance and accessing the photos it takes:\n\n```html\n<app-media-image-capture\n    id=\"imageCapture\"\n    stream=\"[[videoStream]]\"\n    focus-mode=\"single-shot\"\n    red-eye-reduction\n    last-photo=\"{{photo}}\">\n</app-media-image-capture>\n```\n\nWhen you are ready to capture a photo, call the `takePhoto` method:\n\n```html\n<script>\nPolymer({\n  is: 'x-camera',\n\n  // ...\n\n  takePhoto: function() {\n    // NOTE: This method also returns a promise that resolves with the photo.\n    this.$.imageCapture.takePhoto();\n  }\n\n  // ...\n});\n</script>\n```\n\n### `<app-media-audio>`\n\nIf you are building a voice memo app, you may wish to access an audio analyzer\nso that you can visualize microphone input in real time. This can be done with\nthe `<app-media-audio>` element:\n\n```html\n<app-media-audio\n    stream=\"[[microphoneStream]]\"\n    analyser=\"{{microphoneAnalyser}}\">\n</app-media-audio>\n```\n\nWhen the `microphoneStream` becomes available, the `microphoneAnalyser` property\nwill be assigned an instance of a Web Audio\n[`AnalyserNode`](https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode)\nthat corresponds to the audio input from that stream. Any stream with at least\none audio track can be used as an input for `<app-media-audio>`.\n\n### `<app-media-waveform>`\n\nThere are many kinds of visualization that might be useful for demonstrating to\nyour users that there is a hot mic on their devices. `<app-media-waveform>` is\na basic SVG visualization that can suit a wide range of visualization needs. 
It\nis very easy to use if you have an `AnalyserNode` instance:\n\n```html\n<app-media-waveform analyser=\"[[microphoneAnalyser]]\">\n</app-media-waveform>\n```\n\nThe visualization is minimal, but its foreground and background can be themed to\nachieve some level of customized look and feel:\n\n```html\n<style>\n  :host {\n    --app-media-waveform-background-color: red;\n    --app-media-waveform-foreground-color: lightblue;\n  }\n</style>\n```\n","gitHead":"17f14bf815a6635efe202f8b446d29087142fc62","_npmUser":{"name":"polymer","email":"admin@polymer-project.org"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"5.6.0","description":"Elements for accessing data from media input devices","directories":{},"resolutions":{"samsam":"1.1.3","inherits":"2.0.3","type-detect":"1.0.0","supports-color":"3.1.2"},"_nodeVersion":"9.7.1","dependencies":{"@polymer/polymer":"^3.0.0-pre.10","@polymer/iron-resizable-behavior":"^3.0.0-pre.10"},"_hasShrinkwrap":false,"readmeFilename":"README.md","devDependencies":{"wct-browser-legacy":"0.0.1-pre.11","@polymer/test-fixture":"^3.0.0-pre.10","@polymer/iron-component-page":"^3.0.0-pre.10","@webcomponents/webcomponentsjs":"^1.0.0"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0-pre.11_1520558226286_0.5066938588818244","host":"s3://npm-registry-packages"}},"3.0.0-pre.12":{"name":"@polymer/app-media","version":"3.0.0-pre.12","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"author":{"name":"The Polymer 
Authors"},"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.12","maintainers":[{"name":"polymer","email":"admin@polymer-project.org"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"d683c202fcaca9e7cc6b1f84440d906e64d05f11","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.12.tgz","fileCount":25,"integrity":"sha512-GiCane6oMzJlZPvsIn8E7Cd+doklQmCVAjPkSae3D0LftdlMkYQJFd5SfhqmNv76xajDqdoJMgZYoOUD2HARew==","signatures":[{"sig":"MEYCIQDAdpoTV49mNOPUDoP24zX5hS2FC5xJp5ge9h/G+H+/pwIhAKkIZF5+qWaop+bggeD6t1IAwyHzksWksdegrzfRvNkv","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":102608},"main":"app-media.js","readme":"## App Media Elements\n\nElements for accessing data from media input devices, such as cameras and\nmicrophones, and visualizing that data for users.\n\n### Introduction\n\nModern web browsers support a set of APIs called\n[Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/). These\nAPIs allow you to access inputs such as microphones and cameras, and to a\nlimited extent they also let you record and visualize the data.\n\nBrowsers of yet greater modernity support an API called\n[MediaStream Recording](https://www.w3.org/TR/mediastream-recording/). This\nAPI makes recording and processing the data from these inputs really easy and\nfun.\n\nApp Media is a series of elements that wrap these APIs. 
The intention is to make\nit easier and more fun to build apps and websites that incorporate video and\naudio recording and visualization, while relying on standardized, highly\nperformant browser APIs.\n\n#### Browser support\n\nThe following emerging platform APIs are used by this collection of elements:\n\n - [Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/)\n - [MediaStream Recording](https://www.w3.org/TR/mediastream-recording/)\n - [Web Audio API](https://www.w3.org/TR/webaudio/)\n - [MediaStream Image Capture](https://w3c.github.io/mediacapture-image/)\n\nSome additional browser support is enabled by the\n[WebRTC polyfill](https://github.com/webrtc/adapter) and the\n[MediaStream ImageCapture API polyfill](https://github.com/GoogleChromeLabs/imagecapture-polyfill).\nThe following table documents browser support for the elements in this collection with\nthese polyfills in use:\n\n - ✅ Stable native implementation\n - 🚧 Partial fidelity with polyfill\n - 🚫 Not supported at all\n\nElement                   | Chrome | Safari 11 | Firefox | Edge  | IE 11\n--------------------------|--------|-----------|---------|-------|------\n`app-media-video`         |     ✅ |        ✅ |      ✅ |    ✅ |    ✅\n`app-media-audio`         |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-waveform`      |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-devices`       |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-stream`        |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-recorder`      |     ✅ |        🚫 |      ✅ |    🚫 |    🚫\n`app-media-image-capture` |     ✅ |        🚧 |      🚧 |    🚧 |    🚫\n\n### How to use\n\nMany apps that access cameras and microphones may wish to start by discovering\nwhat is possible on the current device. A laptop typically has only one camera,\nbut a phone often has one or two. 
A microphone is often present on devices these\ndays, but it's good to know for sure.\n\nBefore you begin, make sure that you are loading the\n[WebRTC polyfill](https://github.com/webrtc/adapter) where appropriate so that\nthe most up-to-date versions of the necessary APIs are usable in all of your\ntarget browsers.\n\n### `<app-media-devices>`\n\n`app-media` offers the `app-media-devices` element to assist in looking up the\navailable cameras, microphones and other inputs on the current device. You can\nconfigure it with a string that can be matched against the kind of device you\nwish to look up, and the element will do the rest. Here is an example that\nbinds an array of all available microphone-like devices to a property called\n`microphones`:\n\n```html\n<app-media-devices kind=\"audioinput\" devices=\"{{microphones}}\">\n</app-media-devices>\n```\n\nIn the example, the available devices are filtered to those that have the string\n`'audioinput'` in their `kind` field. It is often convenient to refer to a\nsingle selected device. This can be done using the `selected-device` property,\nwhich points to a single device in the list at a time:\n\n```html\n<app-media-devices kind=\"audioinput\" selected-device=\"{{microphone}}\">\n</app-media-devices>\n```\n\n### `<app-media-stream>`\n\nOnce you have found a device that you like, you'll need to access a\n`MediaStream` of the input from the device. The `app-media-stream` element makes it\neasy to convert a device reference to a `MediaStream`:\n\n```html\n<app-media-stream audio-device=\"[[microphone]]\" stream=\"{{microphoneStream}}\">\n</app-media-stream>\n```\n\nHowever, sometimes you don't know which device you want to use. The Media\nCapture and Streams API allows users to\n[specify constraints](https://w3c.github.io/mediacapture-main/#constrainable-properties)\nrelated to the input device. 
For example, if you wish to access a camera stream,\nand you would prefer to get the back-facing camera if available, you could do\nsomething like this:\n\n```html\n<app-media-stream\n    video-constraints='{\"facingMode\":\"environment\"}'\n    stream=\"{{backFacingCameraStream}}\">\n</app-media-stream>\n```\n\nYou can also use `app-media-stream` to capture the device screen.\n\nScreen sharing in Chrome and Firefox has some differences. See\n[this](https://www.webrtc-experiment.com/Pluginfree-Screen-Sharing/#why-screen-fails)\npage for more info.\n\nTo capture the screen in Chrome, use the `{\"mandatory\": {\"chromeMediaSource\": \"screen\"}}`\nvideo constraint:\n\n```html\n<app-media-stream\n    video-constraints='{\"mandatory\": {\"chromeMediaSource\": \"screen\"}}'\n    stream=\"{{stream}}\"\n    active>\n</app-media-stream>\n```\n\nNOTE: As of this writing (April 23rd, 2017), screen capturing in Chrome is available only on\nAndroid and requires enabling the `chrome://flags#enable-usermedia-screen-capturing` flag.\n\nTo capture the screen in Firefox, use the `{\"mediaSource\": \"screen\"}` video constraint:\n\n```html\n<app-media-stream\n    video-constraints='{\"mediaSource\": \"screen\"}'\n    stream=\"{{stream}}\"\n    active>\n</app-media-stream>\n```\n\nYou can also use `{\"mediaSource\": \"window\"}` to capture only the application window\nand `{\"mediaSource\": \"application\"}` to capture all application windows,\nnot the whole screen.\n\nNOTE: Firefox (before version 52) requires setting `media.getusermedia.screensharing.enabled`\nto `true` and adding the web app domain to `media.getusermedia.screensharing.allowed_domains`\nin `about:config`.\n\nIt's easy to create a stream that contains both audio and video tracks as well.\nAny combination of devices and constraints can be used when configuring:\n\n```html\n<app-media-stream\n    audio-device=\"[[microphone]]\"\n    video-constraints='{\"facingMode\":\"environment\"}'\n    
stream=\"{{cameraAndMicrophoneStream}}\">\n</app-media-stream>\n```\n\nNOTE: Chrome doesn't support combining screen capture video tracks with audio tracks.\n\n### `<app-media-video>`\n\nSuppose you are planning to build an awesome camera app. At some point, you will\nneed to convert your `MediaStream` instance into video that the user can see, so\nthat she knows what is being recorded. Conveniently, you don't need a special\nelement to make this work. You can actually just use a basic `<video>` element:\n\n```html\n<video src-object=\"[[backFacingCameraStream]]\" autoplay></video>\n```\n\nOnce the `backFacingCameraStream` is available, the `<video>` element will\ndisplay video from the camera. But, without further intervention, the video will\nchange its size to be the pixel dimensions of the incoming video feed. If you\nare building a camera app, you may want a video that scales predictably inside\nof its container. An easy way to get this is to use `<app-media-video>`:\n\n```html\n<app-media-video source=\"[[backFacingCameraStream]]\" autoplay>\n</app-media-video>\n```\n\nBy default, `<app-media-video>` will automatically scale the video so that it\nis \"full bleed\" relative to the dimensions of the `<app-media-video>` element.\nIt can also be configured to scale the video so that it is contained instead of\ncropped by the boundary of `<app-media-video>`:\n\n```html\n<app-media-video source=\"[[backFacingCameraStream]]\" autoplay contain>\n</app-media-video>\n```\n\nNote that when using a combined stream of camera and microphone data, you may\nwish to mute the video in order to avoid creating a feedback loop.\n\n### `<app-media-recorder>`\n\nEventually you will want to record actual video and audio from the\n`MediaStream`. 
This is where the MediaStream Recording API comes in, and there\nis an element, `<app-media-recorder>`, that makes it nice and declarative.\nIn order to use it, configure the element with an optional duration and bind the\nstream to it:\n\n```html\n<app-media-recorder\n    id=\"recorder\"\n    stream=\"[[cameraAndMicrophoneStream]]\"\n    data=\"{{recordedVideo}}\"\n    duration=\"3000\">\n</app-media-recorder>\n```\n\nWhen you are ready to make a recording, call the `start` method on the element:\n\n```html\n<script>\nPolymer({\n  is: 'x-camera',\n\n  // ...\n\n  createRecording: function() {\n    this.$.recorder.start();\n  }\n\n  // ...\n});\n</script>\n```\n\nThe `<app-media-recorder>` will start recording from the configured stream and\nautomatically stop after the configured duration. While the recording is taking\nplace, the element will dispatch `app-media-recorder-chunk` events that contain\nindividual data chunks as provided by `MediaRecorder` in the `dataavailable`\nevent. Once the recording is available, the element will assign it to the `data` property\n(this will also update the bound `recordedVideo` property in the example above).\n\nIf you don't configure a `duration`, then the recording will continue until you\ncall the `stop` method on the recorder instance.\n\n### `<app-media-image-capture>`\n\nAn emerging standard defines the\n[Image Capture API](https://w3c.github.io/mediacapture-image/), which allows for\nmore fine-grained control of camera settings such as color temperature, white\nbalance, focus and flash. 
It also allows for direct JPEG capture of the image\nthat appears in a given media device.\n\nThe `<app-media-image-capture>` element offers a declarative strategy for\nconfiguring an `ImageCapture` instance and accessing the photos it takes:\n\n```html\n<app-media-image-capture\n    id=\"imageCapture\"\n    stream=\"[[videoStream]]\"\n    focus-mode=\"single-shot\"\n    red-eye-reduction\n    last-photo=\"{{photo}}\">\n</app-media-image-capture>\n```\n\nWhen you are ready to capture a photo, call the `takePhoto` method:\n\n```html\n<script>\nPolymer({\n  is: 'x-camera',\n\n  // ...\n\n  takePhoto: function() {\n    // NOTE: This method also returns a promise that resolves with the photo.\n    this.$.imageCapture.takePhoto();\n  }\n\n  // ...\n});\n</script>\n```\n\n### `<app-media-audio>`\n\nIf you are building a voice memo app, you may wish to access an audio analyzer\nso that you can visualize microphone input in real time. This can be done with\nthe `<app-media-audio>` element:\n\n```html\n<app-media-audio\n    stream=\"[[microphoneStream]]\"\n    analyser=\"{{microphoneAnalyser}}\">\n</app-media-audio>\n```\n\nWhen the `microphoneStream` becomes available, the `microphoneAnalyser` property\nwill be assigned an instance of a Web Audio\n[`AnalyserNode`](https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode)\nthat corresponds to the audio input from that stream. Any stream with at least\none audio track can be used as an input for `<app-media-audio>`.\n\n### `<app-media-waveform>`\n\nThere are many kinds of visualization that might be useful for demonstrating to\nyour users that there is a hot mic on their devices. `<app-media-waveform>` is\na basic SVG visualization that can suit a wide range of visualization needs. 
It\nis very easy to use if you have an `AnalyserNode` instance:\n\n```html\n<app-media-waveform analyser=\"[[microphoneAnalyser]]\">\n</app-media-waveform>\n```\n\nThe visualization is minimal, but its foreground and background can be themed to\nachieve some level of customized look and feel:\n\n```html\n<style>\n  :host {\n    --app-media-waveform-background-color: red;\n    --app-media-waveform-foreground-color: lightblue;\n  }\n</style>\n```\n","gitHead":"51c01755e1b874d80a57d18533457ba348cea5cd","_npmUser":{"name":"polymer","email":"admin@polymer-project.org"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"5.5.1","description":"Elements for accessing data from media input devices","directories":{},"resolutions":{"samsam":"1.1.3","inherits":"2.0.3","type-detect":"1.0.0","supports-color":"3.1.2"},"_nodeVersion":"9.2.0","dependencies":{"@polymer/polymer":"3.0.0-pre.12","@polymer/iron-resizable-behavior":"3.0.0-pre.12"},"_hasShrinkwrap":false,"readmeFilename":"README.md","devDependencies":{"wct-browser-legacy":"0.0.1-pre.11","@polymer/test-fixture":"3.0.0-pre.12","@polymer/iron-component-page":"3.0.0-pre.12","@webcomponents/webcomponentsjs":"^1.0.0"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0-pre.12_1521737788654_0.26499509978688907","host":"s3://npm-registry-packages"}},"3.0.0-pre.13":{"name":"@polymer/app-media","version":"3.0.0-pre.13","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"author":{"name":"The Polymer 
Authors"},"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.13","maintainers":[{"name":"polymer","email":"admin@polymer-project.org"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"d6a8445092b1cca7651345e097276e8cac404e4d","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.13.tgz","fileCount":25,"integrity":"sha512-1MDXuYln8z+8QeEnrO6TSf1jgikmPdUMulz2uA+TEoM1b/KpWHBEg29aDO3EXX20czkXrCd7kaMuu7tW3cL5rg==","signatures":[{"sig":"MEYCIQDmBoAOuamJboFw9pQVIrMGbcXshWIKK/bO8fEHNP47wAIhAKM+tNcmh/EwHJ+duyaOPSK9EBGOlBf53nWxVnLLg6kC","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":111220,"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v3.0.4\r\nComment: https://openpgpjs.org\r\n\r\nwsFcBAEBCAAQBQJa6PCFCRA9TVsSAnZWagAA6TEP/1cfDxGqiEdhU25T5ik6\nNjm3/gy4BLRcTjYVGhfIrV22oJy+iyu71a2cfmhZHDDTM8zjYJlmkFs8BENZ\n0pnwWzX1nJSPXAqjs+rI/JlQz3S0Fx/e401j8Agmk/+fqGzffI9yS7k55B6p\naIT2Q0v3+sPajUrpXMTRT8PxVH1mFaKO5pjkw9CopTZ0D5Mku0hTcUCKgNO7\npLCZ0xWc7Suqedt63CcdazRQDon5NPWQPhcMRWJYXkLOLiJwNXbFHZW6mcoh\nGJ8MdBUhwAPoHla9Y5IVNyossg9wuAX7JsYZlt4u9jg9cjhNOghC+2A02Vue\nRgnCC57EK7wVrP3gYNxR5em8l9wI08vIKpY9iPB7vAUP1slHeTtmN87Pmhh1\nnADbVpP4X0MyG6jvzIdyq+ljCyXskLlPf++D4JVwanbLHCerCPn9COlPtBES\neoDB3AvzdP6p74pHmLbrh6wtpkChKTURQVLMUG5m4JSpo9DbZ2qa/N5BGRF4\nliSNNU0blF1+IHaiO/tM8l6hDrWiOpvpneWR46KZZrYjy4rnLJi2fPO06Kvz\nZPmBEX8najjoCny6Za4FmyKqscKtDnVREXox/vb9/uoXFIytBSkYShS6iGbJ\nkh08x61DdTp+9L4Dd/RBkhvY/Zz29dIumMggak6ddIXUvV8t4ts30YY6kTGm\nw8sj\r\n=m0Cv\r\n-----END PGP SIGNATURE-----\r\n"},"main":"app-media.js","readme":"## App Media Elements\n\nElements for accessing data from media input devices, such as cameras and\nmicrophones, and visualizing that data for users.\n\n### Introduction\n\nModern web browsers support a set of APIs called\n[Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/). 
These\nAPIs allow you to access inputs such as microphones and cameras, and to a\nlimited extent they also let you record and visualize the data.\n\nBrowsers of yet greater modernity support an API called\n[MediaStream Recording](https://www.w3.org/TR/mediastream-recording/). This\nAPI makes recording and processing the data from these inputs really easy and\nfun.\n\nApp Media is a series of elements that wrap these APIs. The intention is to make\nit easier and more fun to build apps and websites that incorporate video and\naudio recording and visualization, while relying on standardized, highly\nperformant browser APIs.\n\n#### Browser support\n\nThe following emerging platform APIs are used by this collection of elements:\n\n - [Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/)\n - [MediaStream Recording](https://www.w3.org/TR/mediastream-recording/)\n - [Web Audio API](https://www.w3.org/TR/webaudio/)\n - [MediaStream Image Capture](https://w3c.github.io/mediacapture-image/)\n\nSome additional browser support is enabled by the\n[WebRTC polyfill](https://github.com/webrtc/adapter) and the\n[MediaStream ImageCapture API polyfill](https://github.com/GoogleChromeLabs/imagecapture-polyfill).\nThe following table documents browser support for the elements in this collection with\nthese polyfills in use:\n\n - ✅ Stable native implementation\n - 🚧 Partial fidelity with polyfill\n - 🚫 Not supported at all\n\nElement                   | Chrome | Safari 11 | Firefox | Edge  | IE 11\n--------------------------|--------|-----------|---------|-------|------\n`app-media-video`         |     ✅ |        ✅ |      ✅ |    ✅ |    ✅\n`app-media-audio`         |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-waveform`      |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-devices`       |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-stream`        |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-recorder`      |     ✅ |        🚫 |      ✅ |    🚫 |    
🚫\n`app-media-image-capture` |     ✅ |        🚧 |      🚧 |    🚧 |    🚫\n\n### How to use\n\nMany apps that access cameras and microphones may wish to start by discovering\nwhat is possible on the current device. A laptop typically has only one camera,\nbut a phone often has one or two. A microphone is often present on devices these\ndays, but it's good to know for sure.\n\nBefore you begin, make sure that you are loading the\n[WebRTC polyfill](https://github.com/webrtc/adapter) where appropriate so that\nthe most up-to-date versions of the necessary APIs are usable in all of your\ntarget browsers.\n\n### `<app-media-devices>`\n\n`app-media` offers the `app-media-devices` element to assist in looking up the\navailable cameras, microphones and other inputs on the current device. You can\nconfigure it with a string that can be matched against the kind of device you\nwish to look up, and the element will do the rest. Here is an example that\nbinds an array of all available microphone-like devices to a property called\n`microphones`:\n\n```html\n<app-media-devices kind=\"audioinput\" devices=\"{{microphones}}\">\n</app-media-devices>\n```\n\nIn the example, the available devices are filtered to those that have the string\n`'audioinput'` in their `kind` field. It is often convenient to refer to a\nsingle selected device. This can be done using the `selected-device` property,\nwhich points to a single device in the list at a time:\n\n```html\n<app-media-devices kind=\"audioinput\" selected-device=\"{{microphone}}\">\n</app-media-devices>\n```\n\n### `<app-media-stream>`\n\nOnce you have found a device that you like, you'll need to access a\n`MediaStream` of the input from the device. The `app-media-stream` element makes it\neasy to convert a device reference to a `MediaStream`:\n\n```html\n<app-media-stream audio-device=\"[[microphone]]\" stream=\"{{microphoneStream}}\">\n</app-media-stream>\n```\n\nHowever, sometimes you don't know which device you want to use. 
The Media\nCapture and Streams API allows users to\n[specify constraints](https://w3c.github.io/mediacapture-main/#constrainable-properties)\nrelated to the input device. For example, if you wish to access a camera stream,\nand you would prefer to get the back-facing camera if available, you could do\nsomething like this:\n\n```html\n<app-media-stream\n    video-constraints='{\"facingMode\":\"environment\"}'\n    stream=\"{{backFacingCameraStream}}\">\n</app-media-stream>\n```\n\nYou can also use `app-media-stream` to capture the device screen.\n\nScreen sharing in Chrome and Firefox has some differences. See\n[this](https://www.webrtc-experiment.com/Pluginfree-Screen-Sharing/#why-screen-fails)\npage for more info.\n\nTo capture the screen in Chrome, use the `{\"mandatory\": {\"chromeMediaSource\": \"screen\"}}`\nvideo constraint:\n\n```html\n<app-media-stream\n    video-constraints='{\"mandatory\": {\"chromeMediaSource\": \"screen\"}}'\n    stream=\"{{stream}}\"\n    active>\n</app-media-stream>\n```\n\nNOTE: As of this writing (April 23rd, 2017), screen capturing in Chrome is available only on\nAndroid and requires enabling the `chrome://flags#enable-usermedia-screen-capturing` flag.\n\nTo capture the screen in Firefox, use the `{\"mediaSource\": \"screen\"}` video constraint:\n\n```html\n<app-media-stream\n    video-constraints='{\"mediaSource\": \"screen\"}'\n    stream=\"{{stream}}\"\n    active>\n</app-media-stream>\n```\n\nYou can also use `{\"mediaSource\": \"window\"}` to capture only the application window\nand `{\"mediaSource\": \"application\"}` to capture all application windows,\nnot the whole screen.\n\nNOTE: Firefox (before version 52) requires setting `media.getusermedia.screensharing.enabled`\nto `true` and adding the web app domain to `media.getusermedia.screensharing.allowed_domains`\nin `about:config`.\n\nIt's easy to create a stream that contains both audio and video tracks as well.\nAny combination of devices and constraints can be used when 
configuring:\n\n```html\n<app-media-stream\n    audio-device=\"[[microphone]]\"\n    video-constraints='{\"facingMode\":\"environment\"}'\n    stream=\"{{cameraAndMicrophoneStream}}\">\n</app-media-stream>\n```\n\nNOTE: Chrome doesn't support combining screen capture video tracks with audio tracks.\n\n### `<app-media-video>`\n\nSuppose you are planning to build an awesome camera app. At some point, you will\nneed to convert your `MediaStream` instance into video that the user can see, so\nthat she knows what is being recorded. Conveniently, you don't need a special\nelement to make this work. You can actually just use a basic `<video>` element:\n\n```html\n<video src-object=\"[[backFacingCameraStream]]\" autoplay></video>\n```\n\nOnce the `backFacingCameraStream` is available, the `<video>` element will\ndisplay video from the camera. But, without further intervention, the video will\nchange its size to be the pixel dimensions of the incoming video feed. If you\nare building a camera app, you may want a video that scales predictably inside\nof its container. An easy way to get this is to use `<app-media-video>`:\n\n```html\n<app-media-video source=\"[[backFacingCameraStream]]\" autoplay>\n</app-media-video>\n```\n\nBy default, `<app-media-video>` will automatically scale the video so that it\nis \"full bleed\" relative to the dimensions of the `<app-media-video>` element.\nIt can also be configured to scale the video so that it is contained instead of\ncropped by the boundary of `<app-media-video>`:\n\n```html\n<app-media-video source=\"[[backFacingCameraStream]]\" autoplay contain>\n</app-media-video>\n```\n\nNote that when using a combined stream of camera and microphone data, you may\nwish to mute the video in order to avoid creating a feedback loop.\n\n### `<app-media-recorder>`\n\nEventually you will want to record actual video and audio from the\n`MediaStream`. 
This is where the MediaStream Recording API comes in, and there\nis an element called `<app-media-recorder>` that makes it nice and declarative.\nIn order to use it, configure the element with an optional duration and bind the\nstream to it:\n\n```html\n<app-media-recorder\n    id=\"recorder\"\n    stream=\"[[cameraAndMicrophoneStream]]\"\n    data=\"{{recordedVideo}}\"\n    duration=\"3000\">\n</app-media-recorder>\n```\n\nWhen you are ready to make a recording, call the `start` method on the element:\n\n```html\n<script>\nPolymer({\n  is: 'x-camera',\n\n  // ...\n\n  createRecording: function() {\n    this.$.recorder.start();\n  }\n\n  // ....\n});\n</script>\n```\n\nThe `<app-media-recorder>` will start recording from the configured stream and\nautomatically stop after the configured duration. While the recording is taking\nplace, the element will dispatch `app-media-recorder-chunk` events that contain\nindividual data chunks as provided by `MediaRecorder` in the `dataavailable`\nevent. Once the recording is available, the element will assign it to the `data` property\n(this will also update the bound `recordedVideo` property in the example above).\n\nIf you don't configure a `duration`, then the recording will continue until you\ncall the `stop` method on the recorder instance.\n\n### `<app-media-image-capture>`\n\nAn emerging standard defines the\n[Image Capture API](https://w3c.github.io/mediacapture-image/), which allows for\nmore fine-grained control of camera settings such as color temperature, white\nbalance, focus and flash. 
It also allows for direct JPEG capture of the image\nthat appears in a given media device.\n\nThe `<app-media-image-capture>` element offers a declarative strategy for\nconfiguring an `ImageCapture` instance and accessing the photos it takes:\n\n```html\n<app-media-image-capture\n    id=\"imageCapture\"\n    stream=\"[[videoStream]]\"\n    focus-mode=\"single-shot\"\n    red-eye-reduction\n    last-photo=\"{{photo}}\">\n</app-media-image-capture>\n```\n\nWhen you are ready to capture a photo, call the `takePhoto` method:\n\n```html\n<script>\nPolymer({\n  is: 'x-camera',\n\n  // ...\n\n  takePhoto: function() {\n    // NOTE: This method also returns a promise that resolves to the photo.\n    this.$.imageCapture.takePhoto();\n  }\n\n  // ....\n});\n</script>\n```\n\n### `<app-media-audio>`\n\nIf you are building a voice memo app, you may wish to access an audio analyzer\nso that you can visualize microphone input in real time. This can be done with\nthe `<app-media-audio>` element:\n\n```html\n<app-media-audio\n    stream=\"[[microphoneStream]]\"\n    analyser=\"{{microphoneAnalyser}}\">\n</app-media-audio>\n```\n\nWhen the `microphoneStream` becomes available, the `microphoneAnalyser` property\nwill be assigned an instance of a Web Audio\n[`AnalyserNode`](https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode)\nthat corresponds to the audio input from that stream. Any stream with at least\none audio track can be used as an input for `<app-media-audio>`.\n\n### `<app-media-waveform>`\n\nThere are many kinds of visualization that might be useful for demonstrating to\nyour users that there is a hot mic on their devices. `<app-media-waveform>` is\na basic SVG visualization that can suit a wide range of visualization needs. 
It\nis very easy to use if you have an `AnalyserNode` instance:\n\n```html\n<app-media-waveform analyser=\"[[microphoneAnalyser]]\">\n</app-media-waveform>\n```\n\nThe visualization is minimal, but its foreground and background can be themed to\nachieve some level of customized look and feel:\n\n```html\n<style>\n  :host {\n    --app-media-waveform-background-color: red;\n    --app-media-waveform-foreground-color: lightblue;\n  }\n</style>\n```\n","gitHead":"9547fb90c1c746038fe1726e05a758d71d9ed3a8","_npmUser":{"name":"polymer","email":"admin@polymer-project.org"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"6.0.0","description":"Elements for accessing data from media input devices","directories":{},"resolutions":{"samsam":"1.1.3","inherits":"2.0.3","type-detect":"1.0.0","supports-color":"3.1.2"},"_nodeVersion":"9.8.0","dependencies":{"@polymer/polymer":"^3.0.0-pre.13","@polymer/iron-resizable-behavior":"^3.0.0-pre.13"},"_hasShrinkwrap":false,"readmeFilename":"README.md","devDependencies":{"image-capture":"^0.3.0","wct-browser-legacy":"^0.0.1-pre.11","@polymer/test-fixture":"^3.0.0-pre.13","@polymer/iron-component-page":"^3.0.0-pre.13","@webcomponents/webcomponentsjs":"^2.0.0-0"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0-pre.13_1525215364951_0.9510616595258767","host":"s3://npm-registry-packages"}},"3.0.0-pre.14":{"name":"@polymer/app-media","version":"3.0.0-pre.14","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"author":{"name":"The Polymer 
Authors"},"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.14","maintainers":[{"name":"polymer","email":"admin@polymer-project.org"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"392fb1741fc3d599fdf5188a5919b58ebc01a799","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.14.tgz","fileCount":25,"integrity":"sha512-WB3P5Lsqvhrpd9hpPppxO/71OU8pEbuaJMMHIXiNpz5xXWqe3LRV6mi9A8xM3/5NP6GkHZrDcGChQnG2JYIjMQ==","signatures":[{"sig":"MEQCIBr29Z2lCnZRLkGDyKFk45b9ywhtHvSAr2ytiKfwlof+AiAU/KIKaZ5led/3rMX6hQbkT7VcWE0UgWz8yFYZ77jh4w==","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":111343,"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v3.0.4\r\nComment: https://openpgpjs.org\r\n\r\nwsFcBAEBCAAQBQJa6QgcCRA9TVsSAnZWagAAqgYP/AzUkykYlvZ0YEJ5iP2p\nBL90A22hTjGwGCtJmpcoUpOwSfKAgDHOBMFNL8UWq9w3aOMSYsyviRlwk/qe\npYs1B7vilSTMKx6UGSJMwpxEBPL6Kcu1yDBa2kFUIOtxccYJpP2713dNaCRU\nCd8o++mB/CsSJVhmRgV/F5tTAf+ez+QevGsb8/8J46FOL282Iq0oLH8p33ny\nRJVjjHvHeIEemitz+Kqsvo9XllyxRSU3zmpQ1YkENaXg6E7VZvdpmLVIWUpm\nYx+x+8w4YEN9p/d+BQ54038WK8hr4l4yXNmZCKFYiOojR+0shp9tZ250Q3aA\nO8juSyzRkvBdtuWZ12pfMMKKFN51VMbdji3rFMUsHTvR9GTM1Jp8QEDnkbsp\nuHujyHUl5LxqJEfj+mQ/aCf5wGGLG34vqnDKUD6jmnzNBaFdU6XSRujQL6KD\nCjL1FLKXLlnO8lhqyMUdVu+0jkL/IMrER0tBDBpvj5e8C/v/AW4dlxQqXw0o\n4vNZwub4zSi8ZR4ygJDKin3E5ZEsu3zMtaBFADiAXHLqbVZNf0xgdvy9TKgU\nFeFybFsVhKbYkfgQy3kRU/4x7fajeKL9sHiCwbSQ7167TIzOFrAfnBaw1n8B\nEALM0V1J2mNA61McPM+rElWBP/lQftSpvD13W10sqacUc+Bv6UfJH2H+kNOu\nzMeY\r\n=ZdET\r\n-----END PGP SIGNATURE-----\r\n"},"main":"app-media.js","readme":"## App Media Elements\n\nElements for accessing data from media input devices, such as cameras and\nmicrophones, and visualizing that data for users.\n\n### Introduction\n\nModern web browsers support a set of APIs called\n[Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/). 
These\nAPIs allow you to access inputs such as microphones and cameras, and to a\nlimited extent they also let you record and visualize the data.\n\nBrowsers of yet greater modernity support an API called\n[MediaStream Recording](https://www.w3.org/TR/mediastream-recording/). This\nAPI makes recording and processing the data from these inputs really easy and\nfun.\n\nApp Media is a series of elements that wrap these APIs. The intention is to make\nit easier and more fun to build apps and websites that incorporate video and\naudio recording and visualization, while relying on standardized, highly\nperformant browser APIs.\n\n#### Browser support\n\nThe following emerging platform APIs are used by this collection of elements:\n\n - [Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/)\n - [MediaStream Recording](https://www.w3.org/TR/mediastream-recording/)\n - [Web Audio API](https://www.w3.org/TR/webaudio/)\n - [MediaStream Image Capture](https://w3c.github.io/mediacapture-image/)\n\nSome additional browser support is enabled by the\n[WebRTC polyfill](https://github.com/webrtc/adapter) and the\n[MediaStream ImageCapture API polyfill](https://github.com/GoogleChromeLabs/imagecapture-polyfill).\nThe following table documents browser support for the elements in this collection with\nthese polyfills in use:\n\n - ✅ Stable native implementation\n - 🚧 Partial fidelity with polyfill\n - 🚫 Not supported at all\n\nElement                   | Chrome | Safari 11 | Firefox | Edge  | IE 11\n--------------------------|--------|-----------|---------|-------|------\n`app-media-video`         |     ✅ |        ✅ |      ✅ |    ✅ |    ✅\n`app-media-audio`         |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-waveform`      |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-devices`       |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-stream`        |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-recorder`      |     ✅ |        🚫 |      ✅ |    🚫 |    
🚫\n`app-media-image-capture` |     ✅ |        🚧 |      🚧 |    🚧 |    🚫\n\n### How to use\n\nMany apps that access cameras and microphones may wish to start by discovering\nwhat is possible on the current device. A laptop typically has only one camera,\nbut a phone often has one or two. A microphone is often present on devices these\ndays, but it's good to know for sure.\n\nBefore you begin, make sure that you are loading the\n[WebRTC polyfill](https://github.com/webrtc/adapter) where appropriate so that\nthe most up-to-date versions of the necessary APIs are usable in all of your\ntarget browsers.\n\n### `<app-media-devices>`\n\n`app-media` offers the `app-media-devices` element to assist in looking up the\navailable cameras, microphones and other inputs on the current device. You can\nconfigure it with a string that can be matched against the kind of device you\nwish to look up, and the element will do the rest. Here is an example that\nbinds an array of all available microphone-like devices to a property called\n`microphones`:\n\n```html\n<app-media-devices kind=\"audioinput\" devices=\"{{microphones}}\">\n</app-media-devices>\n```\n\nIn the example, the available devices are filtered to those that have the string\n`'audioinput'` in their `kind` field. It is often convenient to refer to a\nsingle selected device. This can be done using the `selected-device` property,\nwhich points to a single device in the list at a time:\n\n```html\n<app-media-devices kind=\"audioinput\" selected-device=\"{{microphone}}\">\n</app-media-devices>\n```\n\n### `<app-media-stream>`\n\nOnce you have found a device that you like, you'll need to access a\n`MediaStream` of the input from the device. The `app-media-stream` element makes it\neasy to convert a device reference to a `MediaStream`:\n\n```html\n<app-media-stream audio-device=\"[[microphone]]\" stream=\"{{microphoneStream}}\">\n</app-media-stream>\n```\n\nHowever, sometimes you don't know which device you want to use. 
The Media\nCapture and Streams API allows users to\n[specify constraints](https://w3c.github.io/mediacapture-main/#constrainable-properties)\nrelated to the input device. For example, if you wish to access a camera stream,\nand you would prefer to get the back-facing camera if available, you could do\nsomething like this:\n\n```html\n<app-media-stream\n    video-constraints='{\"facingMode\":\"environment\"}'\n    stream=\"{{backFacingCameraStream}}\">\n</app-media-stream>\n```\n\nYou can also use `app-media-stream` to capture a device screen.\n\nScreen sharing works somewhat differently in Chrome and Firefox. See\n[this](https://www.webrtc-experiment.com/Pluginfree-Screen-Sharing/#why-screen-fails)\npage for more info.\n\nTo capture the screen in Chrome, use the `{\"mandatory\": {\"chromeMediaSource\": \"screen\"}}`\nvideo constraint:\n\n```html\n<app-media-stream\n    video-constraints='{\"mandatory\": {\"chromeMediaSource\": \"screen\"}}'\n    stream=\"{{stream}}\"\n    active>\n</app-media-stream>\n```\n\nNOTE: As of this writing (April 23rd, 2017), screen capturing in Chrome is available only on\nAndroid and requires enabling the `chrome://flags#enable-usermedia-screen-capturing` flag.\n\nTo capture the screen in Firefox, use the `{\"mediaSource\": \"screen\"}` video constraint:\n\n```html\n<app-media-stream\n    video-constraints='{\"mediaSource\": \"screen\"}'\n    stream=\"{{stream}}\"\n    active>\n</app-media-stream>\n```\n\nYou can also use `{\"mediaSource\": \"window\"}` to capture only the application window\nand `{\"mediaSource\": \"application\"}` to capture all application windows,\nnot the whole screen.\n\nNOTE: Firefox (before version 52) requires setting `media.getusermedia.screensharing.enabled`\nto `true` and adding the web app domain to `media.getusermedia.screensharing.allowed_domains`\nin `about:config`.\n\nIt's easy to create a stream that contains both audio and video tracks as well.\nAny combination of devices and constraints can be used when 
configuring:\n\n```html\n<app-media-stream\n    audio-device=\"[[microphone]]\"\n    video-constraints='{\"facingMode\":\"environment\"}'\n    stream=\"{{cameraAndMicrophoneStream}}\">\n</app-media-stream>\n```\n\nNOTE: Chrome doesn't support combining screen capture video tracks with audio tracks.\n\n### `<app-media-video>`\n\nSuppose you are planning to build an awesome camera app. At some point, you will\nneed to convert your `MediaStream` instance into video that the user can see, so\nthat she knows what is being recorded. Conveniently, you don't need a special\nelement to make this work. You can actually just use a basic `<video>` element:\n\n```html\n<video src-object=\"[[backFacingCameraStream]]\" autoplay></video>\n```\n\nOnce the `backFacingCameraStream` is available, the `<video>` element will\ndisplay video from the camera. But, without further intervention, the video will\nchange its size to be the pixel dimensions of the incoming video feed. If you\nare building a camera app, you may want a video that scales predictably inside\nof its container. An easy way to get this is to use `<app-media-video>`:\n\n```html\n<app-media-video source=\"[[backFacingCameraStream]]\" autoplay>\n</app-media-video>\n```\n\nBy default, `<app-media-video>` will automatically scale the video so that it\nis \"full bleed\" relative to the dimensions of the `<app-media-video>` element.\nIt can also be configured to scale the video so that it is contained instead of\ncropped by the boundary of `<app-media-video>`:\n\n```html\n<app-media-video source=\"[[backFacingCameraStream]]\" autoplay contain>\n</app-media-video>\n```\n\nNote that when using a combined stream of camera and microphone data, you may\nwish to mute the video in order to avoid creating a feedback loop.\n\n### `<app-media-recorder>`\n\nEventually you will want to record actual video and audio from the\n`MediaStream`. 
This is where the MediaStream Recording API comes in, and there\nis an element called `<app-media-recorder>` that makes it nice and declarative.\nIn order to use it, configure the element with an optional duration and bind the\nstream to it:\n\n```html\n<app-media-recorder\n    id=\"recorder\"\n    stream=\"[[cameraAndMicrophoneStream]]\"\n    data=\"{{recordedVideo}}\"\n    duration=\"3000\">\n</app-media-recorder>\n```\n\nWhen you are ready to make a recording, call the `start` method on the element:\n\n```html\n<script>\nPolymer({\n  is: 'x-camera',\n\n  // ...\n\n  createRecording: function() {\n    this.$.recorder.start();\n  }\n\n  // ....\n});\n</script>\n```\n\nThe `<app-media-recorder>` will start recording from the configured stream and\nautomatically stop after the configured duration. While the recording is taking\nplace, the element will dispatch `app-media-recorder-chunk` events that contain\nindividual data chunks as provided by `MediaRecorder` in the `dataavailable`\nevent. Once the recording is available, the element will assign it to the `data` property\n(this will also update the bound `recordedVideo` property in the example above).\n\nIf you don't configure a `duration`, then the recording will continue until you\ncall the `stop` method on the recorder instance.\n\n### `<app-media-image-capture>`\n\nAn emerging standard defines the\n[Image Capture API](https://w3c.github.io/mediacapture-image/), which allows for\nmore fine-grained control of camera settings such as color temperature, white\nbalance, focus and flash. 
It also allows for direct JPEG capture of the image\nthat appears in a given media device.\n\nThe `<app-media-image-capture>` element offers a declarative strategy for\nconfiguring an `ImageCapture` instance and accessing the photos it takes:\n\n```html\n<app-media-image-capture\n    id=\"imageCapture\"\n    stream=\"[[videoStream]]\"\n    focus-mode=\"single-shot\"\n    red-eye-reduction\n    last-photo=\"{{photo}}\">\n</app-media-image-capture>\n```\n\nWhen you are ready to capture a photo, call the `takePhoto` method:\n\n```html\n<script>\nPolymer({\n  is: 'x-camera',\n\n  // ...\n\n  takePhoto: function() {\n    // NOTE: This method also returns a promise that resolves to the photo.\n    this.$.imageCapture.takePhoto();\n  }\n\n  // ....\n});\n</script>\n```\n\n### `<app-media-audio>`\n\nIf you are building a voice memo app, you may wish to access an audio analyzer\nso that you can visualize microphone input in real time. This can be done with\nthe `<app-media-audio>` element:\n\n```html\n<app-media-audio\n    stream=\"[[microphoneStream]]\"\n    analyser=\"{{microphoneAnalyser}}\">\n</app-media-audio>\n```\n\nWhen the `microphoneStream` becomes available, the `microphoneAnalyser` property\nwill be assigned an instance of a Web Audio\n[`AnalyserNode`](https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode)\nthat corresponds to the audio input from that stream. Any stream with at least\none audio track can be used as an input for `<app-media-audio>`.\n\n### `<app-media-waveform>`\n\nThere are many kinds of visualization that might be useful for demonstrating to\nyour users that there is a hot mic on their devices. `<app-media-waveform>` is\na basic SVG visualization that can suit a wide range of visualization needs. 
It\nis very easy to use if you have an `AnalyserNode` instance:\n\n```html\n<app-media-waveform analyser=\"[[microphoneAnalyser]]\">\n</app-media-waveform>\n```\n\nThe visualization is minimal, but its foreground and background can be themed to\nachieve some level of customized look and feel:\n\n```html\n<style>\n  :host {\n    --app-media-waveform-background-color: red;\n    --app-media-waveform-foreground-color: lightblue;\n  }\n</style>\n```\n","gitHead":"413d9a2d2fa87f001001a4400a13d8cb4624fec1","_npmUser":{"name":"polymer","email":"admin@polymer-project.org"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"6.0.0","description":"Elements for accessing data from media input devices","directories":{},"resolutions":{"samsam":"1.1.3","inherits":"2.0.3","type-detect":"1.0.0","supports-color":"3.1.2"},"_nodeVersion":"9.8.0","dependencies":{"@polymer/polymer":"^3.0.0-pre.13","@polymer/iron-resizable-behavior":"^3.0.0-pre.14"},"_hasShrinkwrap":false,"readmeFilename":"README.md","devDependencies":{"image-capture":"^0.3.0","wct-browser-legacy":"^0.0.1-pre.11","@polymer/test-fixture":"^3.0.0-pre.14","@polymer/iron-component-page":"^3.0.0-pre.14","@webcomponents/webcomponentsjs":"^2.0.0-0"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0-pre.14_1525221403759_0.8970029081903295","host":"s3://npm-registry-packages"}},"3.0.0-pre.15":{"name":"@polymer/app-media","version":"3.0.0-pre.15","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"author":{"name":"The Polymer 
Authors"},"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.15","maintainers":[{"name":"polymer","email":"admin@polymer-project.org"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"832a961356dbd65e6bd33878ab32b994485a04a1","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.15.tgz","fileCount":26,"integrity":"sha512-TkMvsiGf7ARb0xBpNDdDNuApmvrT8B8QhMZMm7mXkNusvhf/MsAkNywS51a7Axa8FH62jZuubcwX7QE2UFUDSA==","signatures":[{"sig":"MEUCIQCbsG698wr79MUZNwCqLJPcbOIQA7DVuouNtggILXQuswIgHfE+GyxZGzgP8/Lz4B1+kXkZVmWH+quGclDwC013yvA=","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":112577,"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v3.0.4\r\nComment: https://openpgpjs.org\r\n\r\nwsFcBAEBCAAQBQJa6iVXCRA9TVsSAnZWagAAWTEP/2owTdbIRueAjC5XHqsT\npNmN9/FdLtT4PWj+1tAdcSWZ4dyWT9Y9NDSG27bUtNu1Fica4OSUjtwUVieq\n6MBsvG70IqTqGZJUB10AOdsO2+Awtso/6J96CL3VSGbksY3TwwtOuX6QZGrA\nlRmqdPTizI3hKzhixpCUQM9s1A62zdTsEipVDiCnHgmK6rwUOacgx656mb4r\nyEhvv6VNSCZt7c7+9mg2gJgIwmBCogQVljFjSMPgSMWD4a4Ormy25851QESh\nVjMngCMbkGNKkX4a1suU/xtTVyaYBL38LcNbiBIfDzktUkxsJqN7BdzEV4be\nKWJpBgRXKGAndVl4fr0ucdRDWcekUs3ClZ45CGxZ44JEnobgDqpXL6rxCrkG\nT4fHjXlb8Cn0wScQGP0Nu/t/zkY+q+oW8MjC8CFTnmpmpYZgER0UvUriab6F\nu6mhOy0TR5vHCxWGE+1blL0CGlJY/LbMyDoW3RwpanJk/CTfgq1G211WUo3z\naXv2twCCCh8RcceGF5KSQKJ4UYMylVk7E0hKkcSa2EElpnety3SB+h0NqP7s\nKc4xcMe3jLhu9ekKn4ZqOpxgTfOfoLg01da1jyOuQMouzFgU3UyfEkqaWy3K\nKO1LIB9n6EAz7Vv6fzSn7IHNyF9QzuqGUjMo7qEz5+L4abHokO3umUgu390U\nR5mR\r\n=H0qA\r\n-----END PGP SIGNATURE-----\r\n"},"main":"app-media.js","readme":"## App Media Elements\n\nElements for accessing data from media input devices, such as cameras and\nmicrophones, and visualizing that data for users.\n\n### Introduction\n\nModern web browsers support a set of APIs called\n[Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/). 
These\nAPIs allow you to access inputs such as microphones and cameras, and to a\nlimited extent they also let you record and visualize the data.\n\nBrowsers of yet greater modernity support an API called\n[MediaStream Recording](https://www.w3.org/TR/mediastream-recording/). This\nAPI makes recording and processing the data from these inputs really easy and\nfun.\n\nApp Media is a series of elements that wrap these APIs. The intention is to make\nit easier and more fun to build apps and websites that incorporate video and\naudio recording and visualization, while relying on standardized, highly\nperformant browser APIs.\n\n#### Browser support\n\nThe following emerging platform APIs are used by this collection of elements:\n\n - [Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/)\n - [MediaStream Recording](https://www.w3.org/TR/mediastream-recording/)\n - [Web Audio API](https://www.w3.org/TR/webaudio/)\n - [MediaStream Image Capture](https://w3c.github.io/mediacapture-image/)\n\nSome additional browser support is enabled by the\n[WebRTC polyfill](https://github.com/webrtc/adapter) and the\n[MediaStream ImageCapture API polyfill](https://github.com/GoogleChromeLabs/imagecapture-polyfill).\nThe following table documents browser support for the elements in this collection with\nthese polyfills in use:\n\n - ✅ Stable native implementation\n - 🚧 Partial fidelity with polyfill\n - 🚫 Not supported at all\n\nElement                   | Chrome | Safari 11 | Firefox | Edge  | IE 11\n--------------------------|--------|-----------|---------|-------|------\n`app-media-video`         |     ✅ |        ✅ |      ✅ |    ✅ |    ✅\n`app-media-audio`         |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-waveform`      |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-devices`       |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-stream`        |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-recorder`      |     ✅ |        🚫 |      ✅ |    🚫 |    
🚫\n`app-media-image-capture` |     ✅ |        🚧 |      🚧 |    🚧 |    🚫\n\n### How to use\n\nMany apps that access cameras and microphones may wish to start by discovering\nwhat is possible on the current device. A laptop typically has only one camera,\nbut a phone often has one or two. A microphone is often present on devices these\ndays, but it's good to know for sure.\n\nBefore you begin, make sure that you are loading the\n[WebRTC polyfill](https://github.com/webrtc/adapter) where appropriate so that\nthe most up-to-date versions of the necessary APIs are usable in all of your\ntarget browsers.\n\n### `<app-media-devices>`\n\n`app-media` offers the `app-media-devices` element to assist in looking up the\navailable cameras, microphones and other inputs on the current device. You can\nconfigure it with a string that can be matched against the kind of device you\nwish to look up, and the element will do the rest. Here is an example that\nbinds an array of all available microphone-like devices to a property called\n`microphones`:\n\n```html\n<app-media-devices kind=\"audioinput\" devices=\"{{microphones}}\">\n</app-media-devices>\n```\n\nIn the example, the available devices are filtered to those that have the string\n`'audioinput'` in their `kind` field. It is often convenient to refer to a\nsingle selected device. This can be done using the `selected-device` property,\nwhich points to a single device in the list at a time:\n\n```html\n<app-media-devices kind=\"audioinput\" selected-device=\"{{microphone}}\">\n</app-media-devices>\n```\n\n### `<app-media-stream>`\n\nOnce you have found a device that you like, you'll need to access a\n`MediaStream` of the input from the device. The `app-media-stream` element makes it\neasy to convert a device reference to a `MediaStream`:\n\n```html\n<app-media-stream audio-device=\"[[microphone]]\" stream=\"{{microphoneStream}}\">\n</app-media-stream>\n```\n\nHowever, sometimes you don't know which device you want to use. 
The Media\nCapture and Streams API allows users to\n[specify constraints](https://w3c.github.io/mediacapture-main/#constrainable-properties)\nrelated to the input device. For example, if you wish to access a camera stream,\nand you would prefer to get the back-facing camera if available, you could do\nsomething like this:\n\n```html\n<app-media-stream\n    video-constraints='{\"facingMode\":\"environment\"}'\n    stream=\"{{backFacingCameraStream}}\">\n</app-media-stream>\n```\n\nYou can also use `app-media-stream` to capture a device screen.\n\nScreen sharing works somewhat differently in Chrome and Firefox. See\n[this](https://www.webrtc-experiment.com/Pluginfree-Screen-Sharing/#why-screen-fails)\npage for more info.\n\nTo capture the screen in Chrome, use the `{\"mandatory\": {\"chromeMediaSource\": \"screen\"}}`\nvideo constraint:\n\n```html\n<app-media-stream\n    video-constraints='{\"mandatory\": {\"chromeMediaSource\": \"screen\"}}'\n    stream=\"{{stream}}\"\n    active>\n</app-media-stream>\n```\n\nNOTE: As of this writing (April 23rd, 2017), screen capturing in Chrome is available only on\nAndroid and requires enabling the `chrome://flags#enable-usermedia-screen-capturing` flag.\n\nTo capture the screen in Firefox, use the `{\"mediaSource\": \"screen\"}` video constraint:\n\n```html\n<app-media-stream\n    video-constraints='{\"mediaSource\": \"screen\"}'\n    stream=\"{{stream}}\"\n    active>\n</app-media-stream>\n```\n\nYou can also use `{\"mediaSource\": \"window\"}` to capture only the application window\nand `{\"mediaSource\": \"application\"}` to capture all application windows,\nnot the whole screen.\n\nNOTE: Firefox (before version 52) requires setting `media.getusermedia.screensharing.enabled`\nto `true` and adding the web app domain to `media.getusermedia.screensharing.allowed_domains`\nin `about:config`.\n\nIt's easy to create a stream that contains both audio and video tracks as well.\nAny combination of devices and constraints can be used when 
configuring:\n\n```html\n<app-media-stream\n    audio-device=\"[[microphone]]\"\n    video-constraints='{\"facingMode\":\"environment\"}'\n    stream=\"{{cameraAndMicrophoneStream}}\">\n</app-media-stream>\n```\n\nNOTE: Chrome doesn't support combining screen capture video tracks with audio tracks.\n\n### `<app-media-video>`\n\nSuppose you are planning to build an awesome camera app. At some point, you will\nneed to convert your `MediaStream` instance into video that the user can see, so\nthat she knows what is being recorded. Conveniently, you don't need a special\nelement to make this work. You can actually just use a basic `<video>` element:\n\n```html\n<video src-object=\"[[backFacingCameraStream]]\" autoplay></video>\n```\n\nOnce the `backFacingCameraStream` is available, the `<video>` element will\ndisplay video from the camera. But, without further intervention, the video will\nchange its size to be the pixel dimensions of the incoming video feed. If you\nare building a camera app, you may want a video that scales predictably inside\nof its container. An easy way to get this is to use `<app-media-video>`:\n\n```html\n<app-media-video source=\"[[backFacingCameraStream]]\" autoplay>\n</app-media-video>\n```\n\nBy default, `<app-media-video>` will automatically scale the video so that it\nis \"full bleed\" relative to the dimensions of the `<app-media-video>` element.\nIt can also be configured to scale the video so that it is contained instead of\ncropped by the boundary of `<app-media-video>`:\n\n```html\n<app-media-video source=\"[[backFacingCameraStream]]\" autoplay contain>\n</app-media-video>\n```\n\nNote that when using a combined stream of camera and microphone data, you may\nwish to mute the video in order to avoid creating a feedback loop.\n\n### `<app-media-recorder>`\n\nEventually you will want to record actual video and audio from the\n`MediaStream`. 
This is where the MediaStream Recording API comes in, and there\nis an element to make it nice and declarative called `<app-media-recorder>`.\nIn order to use it, configure the element with an optional duration and bind the\nstream to it:\n\n```html\n<app-media-recorder\n    id=\"recorder\"\n    stream=\"[[cameraAndMicrophoneStream]]\"\n    data=\"{{recordedVideo}}\"\n    duration=\"3000\">\n</app-media-recorder>\n```\n\nWhen you are ready to make a recording, call the `start` method on the element:\n\n```html\n<script>\nPolymer({\n  is: 'x-camera',\n\n  // ...\n\n  createRecording: function() {\n    this.$.recorder.start();\n  }\n\n  // ....\n});\n</script>\n```\n\nThe `<app-media-recorder>` will start recording from the configured stream and\nautomatically stop after the configured duration. While the recording is taking\nplace, the element will dispatch `app-media-recorder-chunk` events that contain\nindividual data chunks as provided by `MediaRecorder` it the `dataavailable`\nevent. Once the recording is available, it will assign it to the `data` property\n(this will also update the bound `recordedVideo` property in the example above).\n\nIf you don't configure a `duration`, then the recording will continue until you\ncall the `stop` method on the recorder instance.\n\n### `<app-media-image-capture>`\n\nAn emerging standard defines the\n[Image Capture API](https://w3c.github.io/mediacapture-image/), which allows for\nmore fine-grained control of camera settings such as color temperature, white\nbalance, focus and flash. 
It also allows for direct JPEG capture of the image\nthat appears in a given media device.\n\nThe `<app-media-image-capture>` element offers a declarative strategy for\nconfiguring an `ImageCapture` instance and accessing the photos it takes:\n\n```html\n<app-media-image-capture\n    id=\"imageCapture\"\n    stream=\"[[videoStream]]\"\n    focus-mode=\"single-shot\"\n    red-eye-reduction\n    last-photo=\"{{photo}}\">\n</app-media-image-capture>\n```\n\nWhen you are ready to capture a photo, call the `takePhoto` method:\n\n```html\n<script>\nPolymer({\n  is: 'x-camera',\n\n  // ...\n\n  takePhoto: function() {\n    // NOTE: This method also returns a promise that resolves with the photo.\n    this.$.imageCapture.takePhoto();\n  }\n\n  // ....\n});\n</script>\n```\n\n### `<app-media-audio>`\n\nIf you are building a voice memo app, you may wish to access an audio analyser\nso that you can visualize microphone input in real time. This can be done with\nthe `<app-media-audio>` element:\n\n```html\n<app-media-audio\n    stream=\"[[microphoneStream]]\"\n    analyser=\"{{microphoneAnalyser}}\">\n</app-media-audio>\n```\n\nWhen the `microphoneStream` becomes available, the `microphoneAnalyser` property\nwill be assigned an instance of a Web Audio\n[`AnalyserNode`](https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode)\nthat corresponds to the audio input from that stream. Any stream with at least\none audio track can be used as an input for `<app-media-audio>`.\n\n### `<app-media-waveform>`\n\nThere are many kinds of visualization that might be useful for demonstrating to\nyour users that there is a hot mic on their devices. `<app-media-waveform>` is\na basic SVG visualization that can suit a wide range of visualization needs. 
It\nis very easy to use if you have an `AnalyserNode` instance:\n\n```html\n<app-media-waveform analyser=\"[[microphoneAnalyser]]\">\n</app-media-waveform>\n```\n\nThe visualization is minimal, but its foreground and background can be themed to\nachieve some level of customized look and feel:\n\n```html\n<style>\n  :host {\n    --app-media-waveform-background-color: red;\n    --app-media-waveform-foreground-color: lightblue;\n  }\n</style>\n```\n","gitHead":"d41f84e3e99f4ad14bddcc0c89e1e7c053061d31","_npmUser":{"name":"polymer","email":"admin@polymer-project.org"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"6.0.0","description":"Elements for accessing data from media input devices","directories":{},"resolutions":{"samsam":"1.1.3","inherits":"2.0.3","type-detect":"1.0.0","supports-color":"3.1.2","@webcomponents/webcomponentsjs":"2.0.0-beta.2"},"_nodeVersion":"9.8.0","dependencies":{"@polymer/polymer":"^3.0.0-pre.13","@polymer/iron-resizable-behavior":"^3.0.0-pre.15"},"_hasShrinkwrap":false,"readmeFilename":"README.md","devDependencies":{"image-capture":"^0.3.0","wct-browser-legacy":"^0.0.1-pre.11","@polymer/test-fixture":"^3.0.0-pre.15","@polymer/iron-component-page":"^3.0.0-pre.15","@webcomponents/webcomponentsjs":"^2.0.0-beta.2"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0-pre.15_1525294422513_0.0589600341098282","host":"s3://npm-registry-packages"}},"3.0.0-pre.16":{"name":"@polymer/app-media","version":"3.0.0-pre.16","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"author":{"name":"The Polymer 
Authors"},"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.16","maintainers":[{"name":"polymer","email":"admin@polymer-project.org"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"de1737f20034998cea858c8af978c3fdce7f4ca1","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.16.tgz","fileCount":26,"integrity":"sha512-HWpR0lS6zDj7GUxfMn+wSA3hTnzJg+9KHEtKSYNvQEcQbDFyJhstQmyCgRqb4hnXOPDuUyuFNOVEW3VdB+UX4Q==","signatures":[{"sig":"MEQCIFO5vxPikBHmLu1By+ua6EEom9Syt9t3cvlwRz8z9QRxAiAQy9U7UyaxJT7KRLOzmxjO8D46T4DNz4ZtFoZiwSDcuA==","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":112577,"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v3.0.4\r\nComment: https://openpgpjs.org\r\n\r\nwsFcBAEBCAAQBQJa7L/JCRA9TVsSAnZWagAAL+oP+wZ4yn1cdAgi9lJWthLP\n9Ywk+4dAhhZP7Srk/dodaY3zFr0yXykGdQoF2ICF+B/CUkkDf9ZEO6Wd1w4K\ny3dLL12w0T30Q7grMxR+85mUBZn7f0Yv5sy0CCxxzoxT0jAMtL4V8Hd8uChs\niD3H/clkdsAIIDFIlqgTTOiWauYmZZpuvGySiyslsjzs55qyKKP5k0ariq2+\nw44V2yG+hpyW0RKzvR7J0HGXO/3KfBybzEIaGNziRfezWOKAUMripmmqy5tx\nEsts5NQJ/3MWJj+5sQ11Rc0HylAFQDWnC+XLcUiyC4RuTnBsfuaHNgazrqTG\n8AHdSMBSc4K2raLT5o6r8/5Q5xl1uuzQzFyZke+ZBPgVQJMEtz/erkmyQkoi\nju3Ud6rzqmp0UThJ1+F0+UsKY2Fk1bOozU2ukWy8vs09W91bSc0N7PBIY2h+\nspBMuHuXP0yrm/8kbO6D0X9peiSfESyPZs1vflOkzRLb+4m9mBisiHaH1BgU\nJxHlIIrDErh+Ca+V6B6mmbz3Uvya8IMWmWk+RiunU/xo5a53eAMQDPuDINJt\n6S355glYLKyR5OhLl1XeXHIHGZGa9I9K/FxgEhQ5MvFlKabHyfPLzOR1T3Nl\nsy5PZMcKGk1AgkNDfPUhz0KlFQ9TJ/0KD/oO2kaOgF+hUTIykpnTeXhWCTrr\nMjUs\r\n=AIMy\r\n-----END PGP SIGNATURE-----\r\n"},"main":"app-media.js","readme":"## App Media Elements\n\nElements for accessing data from media input devices, such as cameras and\nmicrophones, and visualizing that data for users.\n\n### Introduction\n\nModern web browsers support a set of APIs called\n[Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/). 
These\nAPIs allow you to access inputs such as microphones and cameras, and to a\nlimited extent they also let you record and visualize the data.\n\nBrowsers of yet greater modernity support an API called\n[MediaStream Recording](https://www.w3.org/TR/mediastream-recording/). This\nAPI makes recording and processing the data from these inputs really easy and\nfun.\n\nApp Media is a series of elements that wrap these APIs. The intention is to make\nit easier and more fun to build apps and websites that incorporate video and\naudio recording and visualization, while relying on standardized, highly\nperformant browser APIs.\n\n#### Browser support\n\nThe following emerging platform APIs are used by this collection of elements:\n\n - [Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/)\n - [MediaStream Recording](https://www.w3.org/TR/mediastream-recording/)\n - [Web Audio API](https://www.w3.org/TR/webaudio/)\n - [MediaStream Image Capture](https://w3c.github.io/mediacapture-image/)\n\nSome additional browser support is enabled by\n[WebRTC polyfill](https://github.com/webrtc/adapter) and\n[MediaStream ImageCapture API polyfill](https://github.com/GoogleChromeLabs/imagecapture-polyfill).\nThe following table documents browser support for the elements in this collection with\nthese polyfills in use\n\n - ✅ Stable native implementation\n - 🚧 Partial fidelity with polyfill\n - 🚫 Not supported at all\n\nElement                   | Chrome | Safari 11 | Firefox | Edge  | IE 11\n--------------------------|--------|-----------|---------|-------|------\n`app-media-video`         |     ✅ |        ✅ |      ✅ |    ✅ |    ✅\n`app-media-audio`         |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-waveform`      |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-devices`       |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-stream`        |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-recorder`      |     ✅ |        🚫 |      ✅ |    🚫 |    
🚫\n`app-media-image-capture` |     ✅ |        🚧 |      🚧 |    🚧 |    🚫\n\n### How to use\n\nMany apps that access cameras and microphones may wish to start by discovering\nwhat is possible on the current device. A laptop typically has only one camera,\nbut a phone often has one or two. A microphone is often present on devices these\ndays, but it's good to know for sure.\n\nBefore you begin, make sure that you are loading the\n[WebRTC polyfill](https://github.com/webrtc/adapter) where appropriate so that\nthe most up to date versions of the necessary APIs are usable in all of your\ntarget browsers.\n\n### `<app-media-devices>`\n\n`app-media` offers the `app-media-devices` element to assist in looking up the\navailable cameras, microphones and other inputs on the current device. You can\nconfigure it with a string that can be matched against the kind of device you\nwish to look up, and the element will do the rest. Here is an example that\nbinds an array of all available microphone-like devices to a property called\n`microphones`:\n\n```html\n<app-media-devices kind=\"audioinput\" devices=\"{{microphones}}\">\n</app-media-devices>\n```\n\nIn the example, the available devices are filtered to those that have the string\n`'audioinput'` in their `kind` field. It is often convenient to refer to a\nsingle selected device. This can be done using the `selected-device` property,\nwhich points to a single device in the list at a time:\n\n```html\n<app-media-devices kind=\"audioinput\" selected-device=\"{{microphone}}\">\n</app-media-devices>\n```\n\n### `<app-media-stream>`\n\nOnce you have found a device that you like, you'll need to access a\n`MediaStream` of the input from the device. The `app-media-stream` makes it\neasy to convert a device reference to a `MediaStream`:\n\n```html\n<app-media-stream audio-device=\"[[microphone]]\" stream=\"{{microphoneStream}}\">\n</app-media-stream>\n```\n\nHowever, sometimes you don't know which device you want to use. 
The Media\nCapture and Streams API allows users to\n[specify constraints](https://w3c.github.io/mediacapture-main/#constrainable-properties)\nrelated to the input device. For example, if you wish to access a camera stream,\nand you would prefer to get the back-facing camera if available, you could do\nsomething like this:\n\n```html\n<app-media-stream\n    video-constraints='{\"facingMode\":\"environment\"}'\n    stream=\"{{backFacingCameraStream}}\">\n</app-media-stream>\n```\n\nYou can use `app-media-stream` to record a device screen.\n\nScreen sharing in Chrome and Firefox has some differences. See\n[this](https://www.webrtc-experiment.com/Pluginfree-Screen-Sharing/#why-screen-fails)\npage for more info.\n\nTo capture the screen in Chrome, use the `{\"mandatory\": {\"chromeMediaSource\": \"screen\"}}`\nvideo constraint:\n\n```html\n<app-media-stream\n    video-constraints='{\"mandatory\": {\"chromeMediaSource\": \"screen\"}}'\n    stream=\"{{stream}}\"\n    active>\n</app-media-stream>\n```\n\nNOTE: As of this writing (April 23rd, 2017), screen capturing in Chrome is available only on\nAndroid and requires enabling the `chrome://flags#enable-usermedia-screen-capturing` flag.\n\nTo capture the screen in Firefox, use the `{\"mediaSource\": \"screen\"}` video constraint:\n\n```html\n<app-media-stream\n    video-constraints='{\"mediaSource\": \"screen\"}'\n    stream=\"{{stream}}\"\n    active>\n</app-media-stream>\n```\n\nYou can also use `{\"mediaSource\": \"window\"}` to capture only a single application window\nand `{\"mediaSource\": \"application\"}` to capture all of an application's windows,\nnot the whole screen.\n\nNOTE: Firefox (before version 52) requires you to set `media.getusermedia.screensharing.enabled`\nto `true` and to add the web app domain to `media.getusermedia.screensharing.allowed_domains`\nin `about:config`.\n\nIt's easy to create a stream that contains both audio and video tracks as well.\nAny combination of devices and constraints can be used when 
configuring:\n\n```html\n<app-media-stream\n    audio-device=\"[[microphone]]\"\n    video-constraints='{\"facingMode\":\"environment\"}'\n    stream=\"{{cameraAndMicrophoneStream}}\">\n</app-media-stream>\n```\n\nNOTE: Chrome doesn't support combining screen capture video tracks with audio tracks.\n\n### `<app-media-video>`\n\nSuppose you are planning to build an awesome camera app. At some point, you will\nneed to convert your `MediaStream` instance into video that the user can see, so\nthat she knows what is being recorded. Conveniently, you don't need a special\nelement to make this work. You can actually just use a basic `<video>` element:\n\n```html\n<video src-object=\"[[backFacingCameraStream]]\" autoplay></video>\n```\n\nOnce the `backFacingCameraStream` is available, the `<video>` element will\ndisplay video from the camera. But, without further intervention, the video will\nchange its size to be the pixel dimensions of the incoming video feed. If you\nare building a camera app, you may want a video that scales predictably inside\nof its container. An easy way to get this is to use `<app-media-video>`:\n\n```html\n<app-media-video source=\"[[backFacingCameraStream]]\" autoplay>\n</app-media-video>\n```\n\nBy default, `<app-media-video>` will automatically scale the video so that it\nis \"full bleed\" relative to the dimensions of the `<app-media-video>` element.\nIt can also be configured to scale the video so that it is contained instead of\ncropped by the boundary of `<app-media-video>`:\n\n```html\n<app-media-video source=\"[[backFacingCameraStream]]\" autoplay contain>\n</app-media-video>\n```\n\nNote that when using a combined stream of camera and microphone data, you may\nwish to mute the video in order to avoid creating a feedback loop.\n\n### `<app-media-recorder>`\n\nEventually you will want to record actual video and audio from the\n`MediaStream`. 
This is where the MediaStream Recording API comes in, and there\nis an element, `<app-media-recorder>`, that makes it nice and declarative.\nIn order to use it, configure the element with an optional duration and bind the\nstream to it:\n\n```html\n<app-media-recorder\n    id=\"recorder\"\n    stream=\"[[cameraAndMicrophoneStream]]\"\n    data=\"{{recordedVideo}}\"\n    duration=\"3000\">\n</app-media-recorder>\n```\n\nWhen you are ready to make a recording, call the `start` method on the element:\n\n```html\n<script>\nPolymer({\n  is: 'x-camera',\n\n  // ...\n\n  createRecording: function() {\n    this.$.recorder.start();\n  }\n\n  // ....\n});\n</script>\n```\n\nThe `<app-media-recorder>` will start recording from the configured stream and\nautomatically stop after the configured duration. While the recording is taking\nplace, the element will dispatch `app-media-recorder-chunk` events that contain\nindividual data chunks as provided by `MediaRecorder` in the `dataavailable`\nevent. Once the recording is available, the element will assign it to the `data`\nproperty (this will also update the bound `recordedVideo` property in the example\nabove).\n\nIf you don't configure a `duration`, then the recording will continue until you\ncall the `stop` method on the recorder instance.\n\n### `<app-media-image-capture>`\n\nAn emerging standard defines the\n[Image Capture API](https://w3c.github.io/mediacapture-image/), which allows for\nmore fine-grained control of camera settings such as color temperature, white\nbalance, focus and flash. 
It also allows for direct JPEG capture of the image\nthat appears in a given media device.\n\nThe `<app-media-image-capture>` element offers a declarative strategy for\nconfiguring an `ImageCapture` instance and accessing the photos it takes:\n\n```html\n<app-media-image-capture\n    id=\"imageCapture\"\n    stream=\"[[videoStream]]\"\n    focus-mode=\"single-shot\"\n    red-eye-reduction\n    last-photo=\"{{photo}}\">\n</app-media-image-capture>\n```\n\nWhen you are ready to capture a photo, call the `takePhoto` method:\n\n```html\n<script>\nPolymer({\n  is: 'x-camera',\n\n  // ...\n\n  takePhoto: function() {\n    // NOTE: This method also returns a promise that resolves with the photo.\n    this.$.imageCapture.takePhoto();\n  }\n\n  // ....\n});\n</script>\n```\n\n### `<app-media-audio>`\n\nIf you are building a voice memo app, you may wish to access an audio analyser\nso that you can visualize microphone input in real time. This can be done with\nthe `<app-media-audio>` element:\n\n```html\n<app-media-audio\n    stream=\"[[microphoneStream]]\"\n    analyser=\"{{microphoneAnalyser}}\">\n</app-media-audio>\n```\n\nWhen the `microphoneStream` becomes available, the `microphoneAnalyser` property\nwill be assigned an instance of a Web Audio\n[`AnalyserNode`](https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode)\nthat corresponds to the audio input from that stream. Any stream with at least\none audio track can be used as an input for `<app-media-audio>`.\n\n### `<app-media-waveform>`\n\nThere are many kinds of visualization that might be useful for demonstrating to\nyour users that there is a hot mic on their devices. `<app-media-waveform>` is\na basic SVG visualization that can suit a wide range of visualization needs. 
It\nis very easy to use if you have an `AnalyserNode` instance:\n\n```html\n<app-media-waveform analyser=\"[[microphoneAnalyser]]\">\n</app-media-waveform>\n```\n\nThe visualization is minimal, but its foreground and background can be themed to\nachieve some level of customized look and feel:\n\n```html\n<style>\n  :host {\n    --app-media-waveform-background-color: red;\n    --app-media-waveform-foreground-color: lightblue;\n  }\n</style>\n```\n","gitHead":"a344ea11cb469ada6c4fa41247e4a70a750fe2b7","_npmUser":{"name":"polymer","email":"admin@polymer-project.org"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"6.0.0","description":"Elements for accessing data from media input devices","directories":{},"resolutions":{"samsam":"1.1.3","inherits":"2.0.3","type-detect":"1.0.0","supports-color":"3.1.2","@webcomponents/webcomponentsjs":"2.0.0-beta.2"},"_nodeVersion":"9.8.0","dependencies":{"@polymer/polymer":"^3.0.0-pre.13","@polymer/iron-resizable-behavior":"^3.0.0-pre.16"},"_hasShrinkwrap":false,"readmeFilename":"README.md","devDependencies":{"image-capture":"^0.3.0","wct-browser-legacy":"^0.0.1-pre.11","@polymer/test-fixture":"^3.0.0-pre.16","@polymer/iron-component-page":"^3.0.0-pre.16","@webcomponents/webcomponentsjs":"^2.0.0-beta.2"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0-pre.16_1525465032668_0.03384898018421745","host":"s3://npm-registry-packages"}},"3.0.0-pre.17":{"name":"@polymer/app-media","version":"3.0.0-pre.17","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"author":{"name":"The Polymer 
Authors"},"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.17","maintainers":[{"name":"polymer","email":"admin@polymer-project.org"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"2d1dff33eafb8026ae3effabf087c8e0cad2f4a8","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.17.tgz","fileCount":26,"integrity":"sha512-l1A9Mu0qTRhrOFF1sCuiC7EsoMLNMqZIzuJ78hqnnmX/S5EknW9OC951DgSVR1Lp+iTfM1mGi8saCfuXZc6AAA==","signatures":[{"sig":"MEUCIG7KmZ18qJy2Bmzygulnk+pmpqJWrCmbCiXXGtLV2NQ7AiEA7btJM1iaM9OL74H7m8sFqzZKpF1DeMQc6KSO7CdszIs=","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":113210,"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v3.0.4\r\nComment: https://openpgpjs.org\r\n\r\nwsFcBAEBCAAQBQJa8TCdCRA9TVsSAnZWagAAwJQP/2CsZ6IaQW86tzsbk834\nmbIvbQ+7h60d+knFMKcsSBThUpaTsRXwB3wVuSU/kzwONP6Rx8NKE5jh6DDd\nOJcYJOLgZCrR8JKdaQHokr3d6T7zWriX8xJMbP2ir8fjlohyzfPhJ+VhdhLC\n0aXgUEAJgYx5xVNuNa/EjS/FfSxSWQ+0r3T6jPQOFc1l0P20l9HRUe+zP4lX\nMJf+KfZ8MlkYX/zcSSoAFcisGoo6RDNyPhCdV7zJIvWQmq8PPQiMNKubivAF\nK4ctquhICNavzDN7AH5O+nFM4awN02ew5Me/DygDEGz+UgTD3yeOE9Xx901k\n2ZGGW3Ot1GYi9QyIjmB5yXp4deybwk8j038WuXw9FX796ebbUVMPE9iDTnk5\nMtBwHMO0FQgC7pyDAHlX2KRFTBUZfTBb64hBm3JI9E2koUpmoCM7alxIlaBX\nuNpLbpnIVR1l0fZXvNqfz1g1oBqjm1TiQRD2gPZoZ9wOvaH8uqqV7hSTr9KN\nxe48C5dkFZppiq8S54UcXS0Ok+7ygLmdfurDR7WrkrNHOUpGWIWCQLfiLc27\nhCrV6FTvPQEFuyc4WQQtrYRbUXUo626v+jMICQtn3sbZCPw+oNJnBgWcisdH\n/+99HJFF+lvKptpuTErKOttB2FuwJst9mbhkFXRsw6NdZFVDN/QI8crqDjPm\nYET+\r\n=UYTm\r\n-----END PGP SIGNATURE-----\r\n"},"main":"app-media.js","readme":"## App Media Elements\n\nElements for accessing data from media input devices, such as cameras and\nmicrophones, and visualizing that data for users.\n\n### Introduction\n\nModern web browsers support a set of APIs called\n[Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/). 
These\nAPIs allow you to access inputs such as microphones and cameras, and to a\nlimited extent they also let you record and visualize the data.\n\nBrowsers of yet greater modernity support an API called\n[MediaStream Recording](https://www.w3.org/TR/mediastream-recording/). This\nAPI makes recording and processing the data from these inputs really easy and\nfun.\n\nApp Media is a series of elements that wrap these APIs. The intention is to make\nit easier and more fun to build apps and websites that incorporate video and\naudio recording and visualization, while relying on standardized, highly\nperformant browser APIs.\n\n#### Browser support\n\nThe following emerging platform APIs are used by this collection of elements:\n\n - [Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/)\n - [MediaStream Recording](https://www.w3.org/TR/mediastream-recording/)\n - [Web Audio API](https://www.w3.org/TR/webaudio/)\n - [MediaStream Image Capture](https://w3c.github.io/mediacapture-image/)\n\nSome additional browser support is enabled by\n[WebRTC polyfill](https://github.com/webrtc/adapter) and\n[MediaStream ImageCapture API polyfill](https://github.com/GoogleChromeLabs/imagecapture-polyfill).\nThe following table documents browser support for the elements in this collection with\nthese polyfills in use\n\n - ✅ Stable native implementation\n - 🚧 Partial fidelity with polyfill\n - 🚫 Not supported at all\n\nElement                   | Chrome | Safari 11 | Firefox | Edge  | IE 11\n--------------------------|--------|-----------|---------|-------|------\n`app-media-video`         |     ✅ |        ✅ |      ✅ |    ✅ |    ✅\n`app-media-audio`         |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-waveform`      |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-devices`       |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-stream`        |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-recorder`      |     ✅ |        🚫 |      ✅ |    🚫 |    
🚫\n`app-media-image-capture` |     ✅ |        🚧 |      🚧 |    🚧 |    🚫\n\n### How to use\n\nMany apps that access cameras and microphones may wish to start by discovering\nwhat is possible on the current device. A laptop typically has only one camera,\nbut a phone often has one or two. A microphone is often present on devices these\ndays, but it's good to know for sure.\n\nBefore you begin, make sure that you are loading the\n[WebRTC polyfill](https://github.com/webrtc/adapter) where appropriate so that\nthe most up to date versions of the necessary APIs are usable in all of your\ntarget browsers.\n\n### `<app-media-devices>`\n\n`app-media` offers the `app-media-devices` element to assist in looking up the\navailable cameras, microphones and other inputs on the current device. You can\nconfigure it with a string that can be matched against the kind of device you\nwish to look up, and the element will do the rest. Here is an example that\nbinds an array of all available microphone-like devices to a property called\n`microphones`:\n\n```html\n<app-media-devices kind=\"audioinput\" devices=\"{{microphones}}\">\n</app-media-devices>\n```\n\nIn the example, the available devices are filtered to those that have the string\n`'audioinput'` in their `kind` field. It is often convenient to refer to a\nsingle selected device. This can be done using the `selected-device` property,\nwhich points to a single device in the list at a time:\n\n```html\n<app-media-devices kind=\"audioinput\" selected-device=\"{{microphone}}\">\n</app-media-devices>\n```\n\n### `<app-media-stream>`\n\nOnce you have found a device that you like, you'll need to access a\n`MediaStream` of the input from the device. The `app-media-stream` makes it\neasy to convert a device reference to a `MediaStream`:\n\n```html\n<app-media-stream audio-device=\"[[microphone]]\" stream=\"{{microphoneStream}}\">\n</app-media-stream>\n```\n\nHowever, sometimes you don't know which device you want to use. 
The Media\nCapture and Streams API allows users to\n[specify constraints](https://w3c.github.io/mediacapture-main/#constrainable-properties)\nrelated to the input device. For example, if you wish to access a camera stream,\nand you would prefer to get the back-facing camera if available, you could do\nsomething like this:\n\n```html\n<app-media-stream\n    video-constraints='{\"facingMode\":\"environment\"}'\n    stream=\"{{backFacingCameraStream}}\">\n</app-media-stream>\n```\n\nYou can use `app-media-stream` to record a device screen.\n\nScreen sharing in Chrome and Firefox has some differences. See\n[this](https://www.webrtc-experiment.com/Pluginfree-Screen-Sharing/#why-screen-fails)\npage for more info.\n\nTo capture the screen in Chrome, use the `{\"mandatory\": {\"chromeMediaSource\": \"screen\"}}`\nvideo constraint:\n\n```html\n<app-media-stream\n    video-constraints='{\"mandatory\": {\"chromeMediaSource\": \"screen\"}}'\n    stream=\"{{stream}}\"\n    active>\n</app-media-stream>\n```\n\nNOTE: As of this writing (April 23rd, 2017), screen capturing in Chrome is available only on\nAndroid and requires enabling the `chrome://flags#enable-usermedia-screen-capturing` flag.\n\nTo capture the screen in Firefox, use the `{\"mediaSource\": \"screen\"}` video constraint:\n\n```html\n<app-media-stream\n    video-constraints='{\"mediaSource\": \"screen\"}'\n    stream=\"{{stream}}\"\n    active>\n</app-media-stream>\n```\n\nYou can also use `{\"mediaSource\": \"window\"}` to capture only a single application window\nand `{\"mediaSource\": \"application\"}` to capture all of an application's windows,\nnot the whole screen.\n\nNOTE: Firefox (before version 52) requires you to set `media.getusermedia.screensharing.enabled`\nto `true` and to add the web app domain to `media.getusermedia.screensharing.allowed_domains`\nin `about:config`.\n\nIt's easy to create a stream that contains both audio and video tracks as well.\nAny combination of devices and constraints can be used when 
configuring:\n\n```html\n<app-media-stream\n    audio-device=\"[[microphone]]\"\n    video-constraints='{\"facingMode\":\"environment\"}'\n    stream=\"{{cameraAndMicrophoneStream}}\">\n</app-media-stream>\n```\n\nNOTE: Chrome doesn't support combining screen capture video tracks with audio tracks.\n\n### `<app-media-video>`\n\nSuppose you are planning to build an awesome camera app. At some point, you will\nneed to convert your `MediaStream` instance into video that the user can see, so\nthat she knows what is being recorded. Conveniently, you don't need a special\nelement to make this work. You can actually just use a basic `<video>` element:\n\n```html\n<video src-object=\"[[backFacingCameraStream]]\" autoplay></video>\n```\n\nOnce the `backFacingCameraStream` is available, the `<video>` element will\ndisplay video from the camera. But, without further intervention, the video will\nchange its size to be the pixel dimensions of the incoming video feed. If you\nare building a camera app, you may want a video that scales predictably inside\nof its container. An easy way to get this is to use `<app-media-video>`:\n\n```html\n<app-media-video source=\"[[backFacingCameraStream]]\" autoplay>\n</app-media-video>\n```\n\nBy default, `<app-media-video>` will automatically scale the video so that it\nis \"full bleed\" relative to the dimensions of the `<app-media-video>` element.\nIt can also be configured to scale the video so that it is contained instead of\ncropped by the boundary of `<app-media-video>`:\n\n```html\n<app-media-video source=\"[[backFacingCameraStream]]\" autoplay contain>\n</app-media-video>\n```\n\nNote that when using a combined stream of camera and microphone data, you may\nwish to mute the video in order to avoid creating a feedback loop.\n\n### `<app-media-recorder>`\n\nEventually you will want to record actual video and audio from the\n`MediaStream`. 
This is where the MediaStream Recording API comes in, and there\nis an element, `<app-media-recorder>`, that makes it nice and declarative.\nIn order to use it, configure the element with an optional duration and bind the\nstream to it:\n\n```html\n<app-media-recorder\n    id=\"recorder\"\n    stream=\"[[cameraAndMicrophoneStream]]\"\n    data=\"{{recordedVideo}}\"\n    duration=\"3000\">\n</app-media-recorder>\n```\n\nWhen you are ready to make a recording, call the `start` method on the element:\n\n```html\n<script>\nPolymer({\n  is: 'x-camera',\n\n  // ...\n\n  createRecording: function() {\n    this.$.recorder.start();\n  }\n\n  // ....\n});\n</script>\n```\n\nThe `<app-media-recorder>` will start recording from the configured stream and\nautomatically stop after the configured duration. While the recording is taking\nplace, the element will dispatch `app-media-recorder-chunk` events that contain\nindividual data chunks as provided by `MediaRecorder` in the `dataavailable`\nevent. Once the recording is available, the element will assign it to the `data`\nproperty (this will also update the bound `recordedVideo` property in the example\nabove).\n\nIf you don't configure a `duration`, then the recording will continue until you\ncall the `stop` method on the recorder instance.\n\n### `<app-media-image-capture>`\n\nAn emerging standard defines the\n[Image Capture API](https://w3c.github.io/mediacapture-image/), which allows for\nmore fine-grained control of camera settings such as color temperature, white\nbalance, focus and flash. 
It also allows for direct JPEG capture of the image\nthat appears in a given media device.\n\nThe `<app-media-image-capture>` element offers a declarative strategy for\nconfiguring an `ImageCapture` instance and accessing the photos it takes:\n\n```html\n<app-media-image-capture\n    id=\"imageCapture\"\n    stream=\"[[videoStream]]\"\n    focus-mode=\"single-shot\"\n    red-eye-reduction\n    last-photo=\"{{photo}}\">\n</app-media-image-capture>\n```\n\nWhen you are ready to capture a photo, call the `takePhoto` method:\n\n```html\n<script>\nPolymer({\n  is: 'x-camera',\n\n  // ...\n\n  takePhoto: function() {\n    // NOTE: This method also returns a promise that resolves with the photo.\n    this.$.imageCapture.takePhoto();\n  }\n\n  // ....\n});\n</script>\n```\n\n### `<app-media-audio>`\n\nIf you are building a voice memo app, you may wish to access an audio analyser\nso that you can visualize microphone input in real time. This can be done with\nthe `<app-media-audio>` element:\n\n```html\n<app-media-audio\n    stream=\"[[microphoneStream]]\"\n    analyser=\"{{microphoneAnalyser}}\">\n</app-media-audio>\n```\n\nWhen the `microphoneStream` becomes available, the `microphoneAnalyser` property\nwill be assigned an instance of a Web Audio\n[`AnalyserNode`](https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode)\nthat corresponds to the audio input from that stream. Any stream with at least\none audio track can be used as an input for `<app-media-audio>`.\n\n### `<app-media-waveform>`\n\nThere are many kinds of visualization that might be useful for demonstrating to\nyour users that there is a hot mic on their devices. `<app-media-waveform>` is\na basic SVG visualization that can suit a wide range of visualization needs. 
It\nis very easy to use if you have an `AnalyserNode` instance:\n\n```html\n<app-media-waveform analyser=\"[[microphoneAnalyser]]\">\n</app-media-waveform>\n```\n\nThe visualization is minimal, but its foreground and background can be themed to\nachieve some level of customized look and feel:\n\n```html\n<style>\n  :host {\n    --app-media-waveform-background-color: red;\n    --app-media-waveform-foreground-color: lightblue;\n  }\n</style>\n```\n","gitHead":"a19d230df2a651d254f7ce42750afcd30dde3602","_npmUser":{"name":"polymer","email":"admin@polymer-project.org"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"6.0.0","description":"Elements for accessing data from media input devices","directories":{},"resolutions":{"samsam":"1.1.3","inherits":"2.0.3","type-detect":"1.0.0","supports-color":"3.1.2","@webcomponents/webcomponentsjs":"2.0.0-beta.2"},"_nodeVersion":"9.8.0","dependencies":{"@polymer/polymer":"^3.0.0-pre.13","@polymer/iron-resizable-behavior":"^3.0.0-pre.17"},"_hasShrinkwrap":false,"readmeFilename":"README.md","devDependencies":{"image-capture":"^0.3.0","webrtc-adapter":"^3.0.0","wct-browser-legacy":"^0.0.1-pre.11","@polymer/test-fixture":"^3.0.0-pre.17","@polymer/iron-component-page":"^3.0.0-pre.17","@webcomponents/webcomponentsjs":"^2.0.0-beta.2"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0-pre.17_1525756061111_0.22117423886340792","host":"s3://npm-registry-packages"}},"3.0.0-pre.18":{"name":"@polymer/app-media","version":"3.0.0-pre.18","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"author":{"name":"The Polymer 
Authors"},"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.18","maintainers":[{"name":"polymer","email":"admin@polymer-project.org"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"b6aed825b10a4a44f471c513d5408ac80f6414be","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.18.tgz","fileCount":25,"integrity":"sha512-mlpLYHscDvJL/RFMteS40wTQzwF2Fsm5anWYobiYj5+Alse+etyJw/B5fo9+2k4EzFlpvWzZdfe9kqSaNscuCQ==","signatures":[{"sig":"MEYCIQDA7fUY/KoQ0Fl2hkOD8ISCvnqTJT1nX78rshec8m5QCwIhAPlJF1F6dqhpJi0XyGNqsAH8TNWdULVlFg4HdDID3fFS","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":111420,"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v3.0.4\r\nComment: https://openpgpjs.org\r\n\r\nwsFcBAEBCAAQBQJa8sajCRA9TVsSAnZWagAANK8P/0J9SsaSsmSvwVP8BL4G\nbas2SoD8DNQof2texAA9idBNOwKvphNALFvz+k6/fLl9dhd7pQLCM8a82qTx\nTuemekxrkAwkEEQuiIBly7So8IBOtmOh/rmLEHWmBf8r0MfIti2QeexCWyrk\nTI6RrVB6pEMQhSEqKnPmcpjHILOw6EG2eSNnlh3wnxRMifh9SKDJD33SZYAl\nx2n1by7TBNQRyIXaHkoxGaRBaabpoP7pS4snwUiyPTgiRixoValBNH1QuGSD\n1Tc8U3wRQu1o9QCc242znGq9NH2D9QJMO1/0XUgS61tKEloGinwv1OZ/CewP\ngz+vXuiIPf+qXgYXLh96v/3wgXpBMB+XlvXDBT6g0RJY2NG40QnglNvXazlm\nhkd5sup6lqUN3V82RIeURuZzGjvD3DNdwxk2VMNCtl8xUQMOmXYa9zNkzeAg\n3NpX2d/fREvJQwmnjP6p1PsjapIEPh1yzAj28ld5Mqf5Rn2PmN7Gup+j1+EC\nZrDyTpA4rN5eEivN7BNwV0cBTS9zovcc/YcyWSsTHsU7+lTVVO3FCjcH9uHb\n3VIdAXDq+qBFfUmWyZUlHx08UOyRJinTwMhK0djsj8JYk+5/nD2n7PkE4r2o\nDI9CH9AeuQx7SlYeimmdItTIxqES7vkh/pAhG0sQ30wTZwJEOHZXlSf5f16v\nOVf2\r\n=+Gge\r\n-----END PGP SIGNATURE-----\r\n"},"main":"app-media.js","gitHead":"049edadc598aa898bfca2e72c51a7b1eb6cae0e0","_npmUser":{"name":"polymer","email":"admin@polymer-project.org"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"6.0.0","description":"Elements for accessing data from media input devices","directories":{},"resolutions":{"samsam":"1.1.3","inherits":"2.0.3","type-detect":"1.0.0","supports-color":"3.1.2","@webcomponents/webcomponentsjs":"2.0.0-beta.2"},"_nodeVersion":"9.8.0","dependencies":{"@polymer/polymer":"^3.0.0","@polymer/iron-resizable-behavior":"^3.0.0-pre.18"},"_hasShrinkwrap":false,"readmeFilename":"README.md","devDependencies":{"image-capture":"^0.3.0","webrtc-adapter":"^3.0.0","wct-browser-legacy":"^0.0.1-pre.11","@polymer/test-fixture":"^3.0.0-pre.18","@polymer/iron-component-page":"^3.0.0-pre.18","@webcomponents/webcomponentsjs":"^2.0.0"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0-pre.18_1525860002586_0.1916950169548135","host":"s3://npm-registry-packages"}},"3.0.0-pre.19":{"name":"@polymer/app-media","version":"3.0.0-pre.19","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"author":{"name":"The Polymer 
Authors"},"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.19","maintainers":[{"name":"polymer","email":"admin@polymer-project.org"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"45235820de5ca4a6cbe1a093ee8c4ba48a2c3dcc","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.19.tgz","fileCount":25,"integrity":"sha512-IkfcQEBwg+Vmo1LHF1NKxPzirgx9j7M8+Ls079QEBPvGs1s5QCjHqnyynUvRq8B2Mz4J8OpvQ+5QQzyZP0uq5A==","signatures":[{"sig":"MEYCIQC/IbPDVKizl/BvDyCUUWaMsUgDTyYXq04Rwf/UBcOG2wIhAPOH9a9s9j9ol2OrIai7kmVEl/37j6hXfVTfBTaFv76Z","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":111420,"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v3.0.4\r\nComment: https://openpgpjs.org\r\n\r\nwsFcBAEBCAAQBQJa81gXCRA9TVsSAnZWagAA1N4P/jJE+CEflc5n4XkzfvsP\nIm4FMAYlF51Gmt0eB1I6lWsqWnS4IUGKdpNLJj0sh5XaZ7CmBpARpo5lqDXh\nS+p2Ive8AIgdFXXlgwzX+gZ7YLrTj7qun9ucgyeWJaYk38Ie0gW3LhnPPvs9\nHjAyNQrKWnxc6/zSelODOansIvRIVRmTiJPgjgwtlp+PDc2mosttF6/kf3zS\nuxIas98Cbv/baBbcKk6pOOASYUQI1ZNt4v0Qn6KGAKx7qNCaxicq++JZIgKQ\n+j3m7ls60FiCu8qfxAfuXKmrpQDuT0xSZqRK7Ic6OTWbKcaJrj8ffeuigH9Z\nEfTg+h1M7QGdvbuIKJSdwUPJ540PTcaaqYNN6PH5yKkRvtYOsDWQIlks0zZN\nQClLnRT7WcfOe/89C7vjcqRlUIg8lLm7YbfmQ3xJjrfqz0y1sAOUXRRjHq84\nQ1oB6eNjfSvTLHNMSWFYULemPqbzviAU7GBsrz8ClAuWWZBcckoht5SN2V8C\nSOrMvvw3DUZ0VgtA0RAZmnse/1Q1Dhzu0sDL2Qpk64xczAIwz2wt7HScbZVf\nPyWbhOo/PjY79rvTJ9nMxwBhgz7KG0xy3IpFWXHv3OvVTxOUSYTX/w9bNW4/\nJi0TVingXQqS9wnz+jcTFvXGXVjAmkJOe5FLU7FPKXfmguS2MiVc/nhbdIT5\nIpxv\r\n=273L\r\n-----END PGP SIGNATURE-----\r\n"},"main":"app-media.js","gitHead":"3b692b48a87d0c88943ff69b43687006362c82cd","_npmUser":{"name":"polymer","email":"admin@polymer-project.org"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"6.0.0","description":"Elements for accessing data from media input 
devices","directories":{},"resolutions":{"samsam":"1.1.3","inherits":"2.0.3","type-detect":"1.0.0","supports-color":"3.1.2","@webcomponents/webcomponentsjs":"2.0.0-beta.2"},"_nodeVersion":"9.8.0","dependencies":{"@polymer/polymer":"^3.0.0","@polymer/iron-resizable-behavior":"^3.0.0-pre.19"},"_hasShrinkwrap":false,"devDependencies":{"image-capture":"^0.3.0","webrtc-adapter":"^3.0.0","wct-browser-legacy":"^0.0.1-pre.11","@polymer/test-fixture":"^3.0.0-pre.19","@polymer/iron-component-page":"^3.0.0-pre.19","@webcomponents/webcomponentsjs":"^2.0.0"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0-pre.19_1525897237959_0.5378441179657971","host":"s3://npm-registry-packages"}},"3.0.0-pre.20":{"name":"@polymer/app-media","version":"3.0.0-pre.20","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"author":{"name":"The Polymer Authors"},"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.20","maintainers":[{"name":"aomarks","email":"aomarks@google.com"},{"name":"azakus","email":"dfreedm2@gmail.com"},{"name":"emarquez","email":"emarquez@google.com"},{"name":"keanulee","email":"npm@keanulee.com"},{"name":"notwaldorf","email":"notwaldorf@gmail.com"},{"name":"polymer-devs","email":"admin@polymer-project.org"},{"name":"usergenic","email":"brendan@usergenic.com"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"b3580c5d0d51dae4e17554e660dda4a513358486","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.20.tgz","fileCount":27,"integrity":"sha512-YZ7fsZOIcNJG5WtuwxX5W2e+ghX1hGKTquqTzuGoDH+WDweVaJQ/JUkYj3tWoJYxIq8gK3UJyL6MkG0gT0ttwA==","signatures":[{"sig":"MEQCID3ibIQPJOrQo305KEwqIXb4gk26UMzzm3XYn9ufyxKIAiByhCUayBunHpswecAVGZsMWQDqBB6xcL/P6iwwT/arEQ==","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":117835,"npm-signature":"-----BEGIN PGP 
SIGNATURE-----\r\nVersion: OpenPGP.js v3.0.4\r\nComment: https://openpgpjs.org\r\n\r\nwsFcBAEBCAAQBQJbLZGsCRA9TVsSAnZWagAAYdIP/3lSqVWdXwbLWzqNcA5w\nhaCqNaB8pZzCgfN6pdKZBI6TneDlzBElsCZpUwCRDvyN3s0UCKsqyGaRsFFD\nvCVf0Nl3aFIuqFIZp52rUVyePb67QAAdXpi6fXpkiIWAMLl05pvR9kpmx6iy\nRysGuJfVsNMUoJWbY6QCjJQrUVzVJGU7+DEuLXbN2MYwoZVNGtNM8gzZVb8q\nLFvE/lFxp8ornsHIS4gThPKsXgeRsAT2NOw0yKVflYYj5mT9pMaXHrUUQMuL\nJPQ/dRHtyzlywX6bS/gt9NtljzEy3b0AIMMtd4ds6jO6a5hlHHqmy0Ur/SWj\n2aMXxrMlgtt80RWIxopioZ7LjGTy79cBseUseIaIDE28fB58Fci8asy7EqGu\nUbwl6tetCnDStAffNhp1g4+Ac/aSmQ+yCKig2LlGfCCtUUelj3KS+4fvmiMs\nUvuoo706CPCFKkN/+1KwExq6R7lXB6sm7N9Wf0FWaQS14uJdjSJ9te4d/Igz\n64bMFe+bPQWkg5VEikqWEJJjafkbVkxd691WOSw+jGmiqFOrvf4moshYsBQE\nuK+F+mLmsbZrtGkOWQuNGuMbGfXdh/CZFS6uYbKsooCQ4cMhdgFbEmIVZn48\nlKOGIS9FmZzJEnZLnOUxJUVdaR9EU/VE6Z9hH95rq4LVo/zIjb48SbPZAgGc\nWoVY\r\n=23j2\r\n-----END PGP SIGNATURE-----\r\n"},"main":"app-media.js","gitHead":"4c8db2843608bbbf5240864b96f4063374054516","scripts":{"format":"webmat","update-types":"bower install && gen-typescript-declarations --deleteExisting --outDir ."},"_npmUser":{"name":"emarquez","email":"emarquez@google.com"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"6.1.0","description":"Elements for accessing data from media input devices","directories":{},"resolutions":{"samsam":"1.1.3","inherits":"2.0.3","type-detect":"1.0.0","supports-color":"3.1.2","@webcomponents/webcomponentsjs":"2.0.0-beta.2"},"_nodeVersion":"9.8.0","dependencies":{"@polymer/polymer":"^3.0.0","@polymer/iron-resizable-behavior":"^3.0.0-pre.20"},"_hasShrinkwrap":false,"readmeFilename":"README.md","devDependencies":{"bower":"^1.8.4","webmat":"^0.2.0","image-capture":"^0.3.0","webrtc-adapter":"^3.0.0","wct-browser-legacy":"^1.0.1","@polymer/test-fixture":"^3.0.0-pre.20","@polymer/iron-component-page":"^3.0.0-pre.20","@webcomponents/webcomponentsjs":"^2.0.0","@polymer/gen-typescript-declarations":"^1.2.2"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0-pre.20_1529713059521_0.9564558606435725","host":"s3://npm-registry-packages"}},"3.0.0-pre.21":{"name":"@polymer/app-media","version":"3.0.0-pre.21","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"author":{"name":"The Polymer 
Authors"},"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.21","maintainers":[{"name":"aomarks","email":"aomarks@google.com"},{"name":"azakus","email":"dfreedm2@gmail.com"},{"name":"emarquez","email":"emarquez@google.com"},{"name":"keanulee","email":"npm@keanulee.com"},{"name":"notwaldorf","email":"notwaldorf@gmail.com"},{"name":"polymer-devs","email":"admin@polymer-project.org"},{"name":"usergenic","email":"brendan@usergenic.com"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"74cc5983e23d098fb74a8a651b1c3f59c19f3bae","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.21.tgz","fileCount":27,"integrity":"sha512-/elL7AkONzTFYL3e7/KNzKZQhG92hdw/DDPQpURFsnreIRlslCEn1rSSvpFjOhfezWy7MSKDORlNrTovozKI0A==","signatures":[{"sig":"MEUCIAxzyyFKZuE+vBPNs/CK7WPct+gbN5KuJT6xQYAVRFkPAiEAwuvGhdyU3ZWgq4DpIQTnmgCzDt3jWkam0tz22gIQzco=","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":117835,"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v3.0.4\r\nComment: https://openpgpjs.org\r\n\r\nwsFcBAEBCAAQBQJbLZzwCRA9TVsSAnZWagAAZ/kP/j7drh9sF1MsRO48Htkz\nF+AAqRTpvXBYzFKcwyJyWU49mYMb+Y583rO9tHWT7DbWrsnqygoJV2RmrgxK\nhjzSxNi12JO1EfY6F07XuAIhJ8/Pzg2NmnoOrxkj2hLZln9Y7KMxSVw/QHzJ\n9uziaf2NJ9d/0zwft6yBvwRitIDf3MVYwIkVc1gnaH2Dvok6XEkRO3VB9XpP\nCTN/e1UKPsSPvQ/DQIU5eVPkaLCxAvIM7FMIe9xOVUFyBMxNtWYgZDBM2bVf\njDhxS8+wRQPoXNzbObHbEVfnjFHy5jjqENuKn1AQhi152GtFWngKEcrptOT9\nOc5+jeH0nhHVsPUD4mUEt5O4LsDHjtd9KKY+45UNFUKKB2fxOJLojT2ntZaF\nRAndZX/MoFAfAfdI0o+/LYqY9HwgQkaHzq2BePmnttjJVzlx4qqN+xGLa3hB\nt01vN7DWxul8VNU3eFY3Cy8uMdescSgpS7LNCT9SMGWKpq2HqtZUTyziUzWZ\n5/sTc5Fr1RpCYTacZcCkHXLJ7rb9qaNgNhci/uY9oze0fg1PyBxuYIaBCuZh\n4Lm9ZLGr+SWrNumIsGAI58oDUwiSPp3OQVmJudWzp/Mh3277P87rq2PQq34K\njsMV3N7S7jqc3XUiyuq+fCKez23Iv4Gol89Z+4jcWy9SRxXdFwEMpv2ocz3O\ngsY5\r\n=utCq\r\n-----END PGP 
SIGNATURE-----\r\n"},"main":"app-media.js","gitHead":"323d107892ae027f89b855c35dfca129873b9935","scripts":{"format":"webmat","update-types":"bower install && gen-typescript-declarations --deleteExisting --outDir ."},"_npmUser":{"name":"emarquez","email":"emarquez@google.com"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"6.1.0","description":"Elements for accessing data from media input devices","directories":{},"resolutions":{"samsam":"1.1.3","inherits":"2.0.3","type-detect":"1.0.0","supports-color":"3.1.2","@webcomponents/webcomponentsjs":"2.0.0-beta.2"},"_nodeVersion":"9.8.0","dependencies":{"@polymer/polymer":"^3.0.0","@polymer/iron-resizable-behavior":"^3.0.0-pre.21"},"_hasShrinkwrap":false,"devDependencies":{"bower":"^1.8.4","webmat":"^0.2.0","image-capture":"^0.3.0","webrtc-adapter":"^3.0.0","wct-browser-legacy":"^1.0.1","@polymer/test-fixture":"^3.0.0-pre.21","@polymer/iron-component-page":"^3.0.0-pre.21","@webcomponents/webcomponentsjs":"^2.0.0","@polymer/gen-typescript-declarations":"^1.2.2"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0-pre.21_1529715944343_0.8876569339160703","host":"s3://npm-registry-packages"}},"3.0.0-pre.22":{"name":"@polymer/app-media","version":"3.0.0-pre.22","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"author":{"name":"The Polymer 
Authors"},"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.22","maintainers":[{"name":"aomarks","email":"aomarks@google.com"},{"name":"azakus","email":"dfreedm2@gmail.com"},{"name":"bicknellr","email":"bicknellr@gmail.com"},{"name":"emarquez","email":"emarquez@google.com"},{"name":"keanulee","email":"npm@keanulee.com"},{"name":"notwaldorf","email":"notwaldorf@gmail.com"},{"name":"polymer-devs","email":"admin@polymer-project.org"},{"name":"sorvell","email":"sorvell@google.com"},{"name":"usergenic","email":"brendan@usergenic.com"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"613e9252d18248daa85a570510df7d1b86dd4632","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.22.tgz","fileCount":28,"integrity":"sha512-mIOJUBlbQJYFCWDGf/XTzRx+UmWcVFf3sF46T2ml/ngg+r6MSkqHnMj+VwfFe/Hrc3d+Oo/SEeuB/S1XO4cnTQ==","signatures":[{"sig":"MEYCIQCDT6trOYoV2dUji2cwdVP3ICe5s7MTVZabAQGK5EorVgIhALdI+iC9FpVr5F+XjV5yT7Og9BAur9qKSCA9d6dRM0PZ","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":128309,"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v3.0.4\r\nComment: 
https://openpgpjs.org\r\n\r\nwsFcBAEBCAAQBQJbgMuWCRA9TVsSAnZWagAAKOAP/3Ldp7e/f2I6I7EIWdQw\n68OcW3ZN2CsM9+qVXQAgv6fRRnNJwazME2ESCbrmPsBNX5BbmhaM2EzuRsom\n5BB03ERSjSW/qytWmbUEY0lnKyHGoGQJZV7XjNikM78EvzOjRGZiHRs4op6I\n4WF1o7z/saCPWg4hrVdJo0BWN0DZIAxcA699mOlaoGh6kVck90thyUNk4IPP\nVmIzVouNS+8vILI6lan8lq/TIkCxQXJUBPWbzkZ3c7AbUNF3pEkFj5rTQmw4\np8ACMbUXax+QVac0oH9/iepnxTug3hIfnyMeg9c6BdtvAdFvLs4FBWDrTUS8\n6Rjx3G3q0qQHvXL3UEj1Pn823hSQxxD1CPioOz/knTBsCWNIpr4SSbPlOjB4\nAp3XUhyJgAwlYeijg6TOQUaszzEZ1yLtYNavPg6mK8/KOmum6EjeiDSAO3WG\n+h3QNSbj7IngFNLI3B3dbtLa4bPoMSdSkW0Qe1FYJJvP22+JB/ZoHTL1RF+g\nsDx4p7A2cQUJB+FAD/fAbgWthK4ZuquuK9sFatOuOfSinfrt3DBwSWIL1ZI3\nHWkEjR+vadZ1ydMo3KMvLxk4Gi88G2Folucm4DDIyyLwzw9p56Kd3XsuSgwQ\n0nphGEMW3RcDK6PDCVV7+fsCmegsnDVDr01YF0NjL1N+lfe/7pZjMlnr4E4z\nKy4d\r\n=bcWy\r\n-----END PGP SIGNATURE-----\r\n"},"main":"app-media.js","gitHead":"c0ab8af14c062405fe3c0f5cdd59cfb0b1a72754","scripts":{"format":"webmat"},"_npmUser":{"name":"bicknellr","email":"bicknellr@gmail.com"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"6.4.0","description":"Elements for accessing data from media input devices","directories":{},"_nodeVersion":"10.9.0","dependencies":{"@polymer/polymer":"^3.0.0","@polymer/iron-resizable-behavior":"^3.0.0-pre.20"},"_hasShrinkwrap":false,"devDependencies":{"webmat":"^0.2.2","image-capture":"^0.3.0","webrtc-adapter":"^3.0.0","wct-browser-legacy":"^1.0.1","@polymer/test-fixture":"^3.0.0-pre.20","@webcomponents/webcomponentsjs":"^2.0.0"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0-pre.22_1535167381840_0.5494935174030395","host":"s3://npm-registry-packages"}},"3.0.0-pre.23":{"name":"@polymer/app-media","version":"3.0.0-pre.23","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"author":{"name":"The Polymer 
Authors"},"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.23","maintainers":[{"name":"aomarks","email":"aomarks@google.com"},{"name":"azakus","email":"dfreedm2@gmail.com"},{"name":"bicknellr","email":"bicknellr@gmail.com"},{"name":"emarquez","email":"emarquez@google.com"},{"name":"keanulee","email":"npm@keanulee.com"},{"name":"notwaldorf","email":"notwaldorf@gmail.com"},{"name":"polymer-devs","email":"admin@polymer-project.org"},{"name":"sorvell","email":"sorvell@google.com"},{"name":"usergenic","email":"brendan@usergenic.com"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"697abf9f2ec698353a8b03593f8fb3432d2ff45d","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.23.tgz","fileCount":28,"integrity":"sha512-9ApciNJfsuEcyra4DE1PZypMtSDTFVHA/oPdcyXQWbtnDrRka0s9fNj0XafuAqdHuuozatt+ukkqeRjZEROdhg==","signatures":[{"sig":"MEYCIQCb6k/82yXwbnYjUBugOKJCn6BoCPAGTzaWDPe/Mi3ZQAIhAPh0fglUwC4vX9K5DdP0e0sxINPHGRTfa7MMzqvElhqg","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":128308,"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v3.0.4\r\nComment: 
https://openpgpjs.org\r\n\r\nwsFcBAEBCAAQBQJbgQITCRA9TVsSAnZWagAAomoP/R24mwcOjgvCgh10eeSB\nirGA5VQWzxLzi0PH/k7ZgIhMLalzQks85nthwF19OH2WmiVzly2Hmshe06ch\n2aBwJHzI8x01pVCRd+T3GbMxvF3nqoAUz7vMY6vmbXbmENLCNm/HxJmDyHiM\nxGjM2TqxX/3Y+6ICmHM192OaQbSt/xp62MwdyIkNuonQnrCd5mvesmWQg7Ju\nqzpyXEiHpZKRFKR59iTKGyQS7UnnTOjtl2a++gD+lpECF6wjdwyatQ3u0uF0\nNy6w68algqky8InelLxm/BrYnkyslek1BBjF/a/p/9oLDLjoslrxbUyxxWyD\nECSdGz2+kYV44aa6T+XOAxIfPBD4noomWfvh1Ny+Gk0DMQ6m9pAijD4UcgNy\nAsHQH6wYOHZztfjbrQPy3MQGazUE9AtEdPFSzA7molyODQME1bQMinZLRQ+u\nJpFFWNESBjShHLd93+LqVpZ7UwZjP0LHKbzHtpIngvIhr8DIlv+BqxnGIeuO\njNKUo9YXjOAUcS59hdJ+P5JUkTrzzHjRZng+yFYD07reHMWgG1et1mjlxJOu\nkWisI9VGkIdsF61H9LOG/W0YI8UDyC5TwnFRCI6c4XWhUQTSjAmiji2vFPJK\nXFEJ9qodI71CbaKQgb/cyycd2l+wUEtX98YkPmcmYrXwxiqTaF3CSg2uo9tI\nDU5o\r\n=CBHW\r\n-----END PGP SIGNATURE-----\r\n"},"main":"app-media.js","gitHead":"c88ffc64526dcd35471e84293c1be457d9c6604a","scripts":{"format":"webmat"},"_npmUser":{"name":"bicknellr","email":"bicknellr@gmail.com"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"6.4.0","description":"Elements for accessing data from media input devices","directories":{},"_nodeVersion":"10.9.0","dependencies":{"@polymer/polymer":"^3.0.0","@polymer/iron-resizable-behavior":"^3.0.0-pre.22"},"_hasShrinkwrap":false,"devDependencies":{"webmat":"^0.2.2","image-capture":"^0.3.0","webrtc-adapter":"^3.0.0","wct-browser-legacy":"^1.0.1","@polymer/test-fixture":"^4.0.0-pre.0","@webcomponents/webcomponentsjs":"^2.0.0"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0-pre.23_1535181330128_0.18583339635687812","host":"s3://npm-registry-packages"}},"3.0.0-pre.24":{"name":"@polymer/app-media","version":"3.0.0-pre.24","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"author":{"name":"The Polymer 
Authors"},"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.24","maintainers":[{"name":"aomarks","email":"aomarks@google.com"},{"name":"azakus","email":"dfreedm2@gmail.com"},{"name":"bicknellr","email":"bicknellr@gmail.com"},{"name":"emarquez","email":"emarquez@google.com"},{"name":"justinfagnani","email":"justin@fagnani.com"},{"name":"keanulee","email":"npm@keanulee.com"},{"name":"notwaldorf","email":"notwaldorf@gmail.com"},{"name":"polymer-devs","email":"admin@polymer-project.org"},{"name":"samli","email":"sam@sam.li"},{"name":"sorvell","email":"sorvell@google.com"},{"name":"usergenic","email":"brendan@usergenic.com"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"1b25d4e920ab55e7d1d123de1e22f16c873baeda","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.24.tgz","fileCount":28,"integrity":"sha512-8W2dvyjyeYI2UTkiIlb+P4QreIhpL+w5d58RlTf4mEf14dWPZoAFeTfP2ne/85BhXFWWduVQrpugm7md9/DWuw==","signatures":[{"sig":"MEUCIDIlzxx4Kh+4iIiUTLO8Htcu6SrsQsQZ6zXNG4GpHJFJAiEA8OFScjggbguXdH58X0HFo3bDyHNSzA/wzyiqWp+R5O4=","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":128365,"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v3.0.4\r\nComment: 
https://openpgpjs.org\r\n\r\nwsFcBAEBCAAQBQJbhgmcCRA9TVsSAnZWagAAvyoP/jMxDi7DZk/HmeVc+hYs\nKf7nU9YDi9YS0ihWCTEGfqo0diYlL/4xwejjcu3YxGMaL0qTPhNRHZ32dV3L\nmHggAdHvMA39xBXKx6PNFeuaczvWXuGiXttchmYkg+uO9iHXs7jrIkk7yNxv\nm9nIz1Xz0iuPZteJSCC9VWTCz5aaLTaESqHEKsBuCKyWwDjVJ88/fjStoyQ4\n57gzy92M1mbFUmwbxDFD2aUEq2ymzHJOWLjcy+xdh3YkllDVQUCx21T4Gid3\n8lGiyFW1vj9A6+5kmtV2XKdQS4ksgoBqZWYlt2MlJBIwKq4U4+02qsH1rqQe\npshNTXeZD2pZeAEKAx5e5i54INtfOfJdHhIlwIjjT8a/NitvXlXawn6vO4/S\nwfz0LAqP63YG9lbYst4r50fJvEZEiJGGkw3uhEWnFveH3t6u6vLi4KYNvIqa\niheOdR37hnw8xKTZaskX3WnqwR5AZI04fxQFCbBTncbNNxUUQo89LL1QMYHS\nY9Vgkz+isjfEh75fFi9Tef9CS+3f72gK76Mvu+1ERVPnFJGZTASlOlh4yN5A\nHqqxp9FvpH1WTtQVRV5gVpduNB4YVwo/tkuwvokfW93yZAiq7A+m/Uy4aRsk\nuaAokLEbN/iN4KkmtH5/4IdAyW7thLcBtF3hkg4QXsIzv2bCWGlopTxzytve\nVlRJ\r\n=iTNC\r\n-----END PGP SIGNATURE-----\r\n"},"main":"app-media.js","gitHead":"5c086653333b44859e74ad426ddf7b7fd20581d6","scripts":{"format":"webmat"},"_npmUser":{"name":"bicknellr","email":"bicknellr@gmail.com"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"6.4.0","description":"Elements for accessing data from media input devices","directories":{},"_nodeVersion":"10.9.0","dependencies":{"@polymer/polymer":"^3.0.0","@polymer/iron-resizable-behavior":"^3.0.0-pre.22"},"_hasShrinkwrap":false,"devDependencies":{"webmat":"^0.2.2","image-capture":"^0.3.0","webrtc-adapter":"^3.0.0","wct-browser-legacy":"^1.0.1","@polymer/test-fixture":"^4.0.0-pre.0","@webcomponents/webcomponentsjs":"^2.0.0","@polymer/gen-typescript-declarations":"^1.5.1"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0-pre.24_1535510939848_0.02283943172396019","host":"s3://npm-registry-packages"}},"3.0.0-pre.25":{"name":"@polymer/app-media","version":"3.0.0-pre.25","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"author":{"name":"The Polymer 
Authors"},"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.25","maintainers":[{"name":"aomarks","email":"aomarks@google.com"},{"name":"azakus","email":"dfreedm2@gmail.com"},{"name":"bicknellr","email":"bicknellr@gmail.com"},{"name":"emarquez","email":"emarquez@google.com"},{"name":"justinfagnani","email":"justin@fagnani.com"},{"name":"keanulee","email":"npm@keanulee.com"},{"name":"notwaldorf","email":"notwaldorf@gmail.com"},{"name":"polymer-devs","email":"admin@polymer-project.org"},{"name":"samli","email":"sam@sam.li"},{"name":"sorvell","email":"sorvell@google.com"},{"name":"usergenic","email":"brendan@usergenic.com"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"0bf7156120019a81d8850fad224a4d948dd973ef","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.25.tgz","fileCount":28,"integrity":"sha512-N8M61wMdjyNGyxtO4mZcbl5b4M/Tty7IF2VWreCt3qIY2TSbXfLKCXh77aiRwZWTlvJc6eVRhRs9oCejRnuQpg==","signatures":[{"sig":"MEYCIQDsNI9dUrWO+U5kEfXgRPg7KGmTTOevLB5DQJ5AHqpqpgIhAIEdySGpSjQIZ/mb/kkDf29nBq/aSijuSjc8VZXraUmN","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":128365,"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v3.0.4\r\nComment: 
https://openpgpjs.org\r\n\r\nwsFcBAEBCAAQBQJbhhrHCRA9TVsSAnZWagAA7+AP/jdpanVJyVuVRVGMADMv\n6pnY0gnVSvLs5aCYjPYXAsM2s/bL++haMIXyovFA/T34yy1seXuAV05/kDFZ\nRN5ivf8ySo4JEQs4FFbzvwwpYpggQ9SsbcHSiO8nwknrfs1VJ/DNBAdA4lb6\nJxP50rGRM2QjqdkfMquxGlNCRYXEV4DD8NM4AgfK4zUFub7CUR7fB9i8tEIv\nc/kPA5lrHtx4C273jnozrgUtHDCKb9VBCv0oUBtrPldKUa2WCGG6gNArTDKy\n687uP9NmJpg5Ksg6+Bg0c8HdEp7F3kbcu7HLHCEjh32DB3ZfhEj2vFfY9zf1\nPyd0FA2Y47ekjpHCTHk4DK0y9RwNECTw5MdCyffvTB4xmBvT536uZoCGZ5SH\nm0KJ2DdTNLmn1O+tYp6Uo57aQ9SqQzdtL1oddYccrztEpy8OnbD1OBr7qW7+\nPbzlkdoPuhHVYoYFhylU5N4+lnyGJ3GutpWQrrrrU5nZ4QGuHgtheCgwQCoc\nbSYuk+QG10bhXKwExDZ4mlGgIKZYTQwSJtGEsgZcPkFGRbo9xMYSlZKXpcJg\n7KmigVII9vjHZPg8pSTdtFXJg/EgXJhuo1Dly94MiPqcI2iRmJXnNP6cFQdD\nBcwDc10Q41QSj3dTmeImFARqqdAE5y+gd+gJv0xuMoUvzJ4iqWkbxKddxtiu\naNxf\r\n=jxG8\r\n-----END PGP SIGNATURE-----\r\n"},"main":"app-media.js","gitHead":"f5a869b641ad124172763898e68c9fa46a4bb0f8","scripts":{"format":"webmat"},"_npmUser":{"name":"bicknellr","email":"bicknellr@gmail.com"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"6.4.0","description":"Elements for accessing data from media input devices","directories":{},"_nodeVersion":"10.9.0","dependencies":{"@polymer/polymer":"^3.0.0","@polymer/iron-resizable-behavior":"^3.0.0-pre.24"},"_hasShrinkwrap":false,"devDependencies":{"webmat":"^0.2.2","image-capture":"^0.3.0","webrtc-adapter":"^3.0.0","wct-browser-legacy":"^1.0.1","@polymer/test-fixture":"^4.0.0-pre.2","@webcomponents/webcomponentsjs":"^2.0.0","@polymer/gen-typescript-declarations":"^1.5.1"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0-pre.25_1535515334481_0.09566748222748922","host":"s3://npm-registry-packages"}},"3.0.0-pre.26":{"name":"@polymer/app-media","version":"3.0.0-pre.26","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"author":{"name":"The Polymer 
Authors"},"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0-pre.26","maintainers":[{"name":"aomarks","email":"aomarks@google.com"},{"name":"azakus","email":"dfreedm2@gmail.com"},{"name":"bicknellr","email":"bicknellr@gmail.com"},{"name":"emarquez","email":"emarquez@google.com"},{"name":"justinfagnani","email":"justin@fagnani.com"},{"name":"keanulee","email":"npm@keanulee.com"},{"name":"notwaldorf","email":"notwaldorf@gmail.com"},{"name":"polymer-devs","email":"admin@polymer-project.org"},{"name":"samli","email":"sam@sam.li"},{"name":"sorvell","email":"sorvell@google.com"},{"name":"usergenic","email":"brendan@usergenic.com"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"d2d850e805f93c88cc691814f54158cb985b1b19","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0-pre.26.tgz","fileCount":28,"integrity":"sha512-eCxMNfGQnv6GZ1UhYqKP8lDZOZPZqt+sCHxP9hVNYeVT3I9taSBKFc31Vq6/We8zsJfQOIO5r5ClMqdPa6ATSw==","signatures":[{"sig":"MEUCICuTYv1CMy8DTw2BMNzzVS/OB8JEj6codsE8C0xg99PyAiEAgXw66s6Q7c+UjQ6F8NQ+md2jkFPgf+t3Fs3fpjp4vH4=","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":128359,"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v3.0.4\r\nComment: 
https://openpgpjs.org\r\n\r\nwsFcBAEBCAAQBQJbmGhxCRA9TVsSAnZWagAA7DoP/17/hpgtuqx8Qvu09YEr\ne4dsNmGIAl6s+r02XYn1h1O30neB5aUTE3xkV91Sh7OV9dn9+ue7ZHZsMfXh\nEaFbg0rkLj295ITGWoG00S6AcpEARVpACycpBPo84IYwJzsWGsnAbttI0BIt\nRA2d88LU2Le1/qkcn9MYmOcHC9j9K9UsNABK0RjAZ+xgp0pQYHmeePx/Ym0H\nzrgr1uwuojTDgpsTvvI1PJIavwYoXRSCQK4HggqDM8aOwegjXoI+6xpDmObf\nDGYtG0uZJiCR6w4Cchz/Dxs893+s6QT4gQU1h1d9J0qxpWAeG/OIbH9Fk8Ti\n4ozgb/arNEGqeD1dDC2YGQxclSsabLlCXkecLT9quAaMFPU/Y+9O0GqYxFbl\njJrJRbK3Smdw2MAuX13y8WgLC8qQS3JnmR1iG3tHEiR4eGTca85frhhA6NV1\nQLNQCXyFKoEtNdDkx3NfCF3T1RHW0hF4lYgXsCm329GIn4B7g6RprTAgXWH3\nDmxsZ/14K4aPGd7dzZc2ewqoft8Q+TuDPot46ca3qPNPWADOC3oKNyWnhAGx\nfAWAklrZOMgx1qUe9LinBd2vJ/xQQQOdtXVPzNNm6MTirad+EeKf0dEJyMjw\niDGq2Q2ngLQa1YfkiQ3VQARHzK0o3fnWNon0a/qzJUb2lixwgoEA+xAz6nn/\nj8EK\r\n=CYCU\r\n-----END PGP SIGNATURE-----\r\n"},"main":"app-media.js","gitHead":"27e3982e04592ec30ca07f3a8c65fa43ee77efc0","scripts":{"format":"webmat"},"_npmUser":{"name":"bicknellr","email":"bicknellr@gmail.com"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"6.4.1","description":"Elements for accessing data from media input devices","directories":{},"_nodeVersion":"10.9.0","dependencies":{"@polymer/polymer":"^3.0.0","@polymer/iron-resizable-behavior":"^3.0.0-pre.25"},"_hasShrinkwrap":false,"devDependencies":{"webmat":"^0.2.2","image-capture":"^0.3.0","webrtc-adapter":"^3.0.0","wct-browser-legacy":"^1.0.1","@polymer/test-fixture":"^4.0.1","@webcomponents/webcomponentsjs":"^2.0.0","@polymer/gen-typescript-declarations":"^1.5.1"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0-pre.26_1536714864811_0.10769475260408656","host":"s3://npm-registry-packages"}},"3.0.0":{"name":"@polymer/app-media","version":"3.0.0","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"author":{"name":"The Polymer 
Authors"},"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.0","maintainers":[{"name":"aomarks","email":"aomarks@google.com"},{"name":"azakus","email":"dfreedm2@gmail.com"},{"name":"bicknellr","email":"bicknellr@gmail.com"},{"name":"emarquez","email":"emarquez@google.com"},{"name":"justinfagnani","email":"justin@fagnani.com"},{"name":"keanulee","email":"npm@keanulee.com"},{"name":"notwaldorf","email":"notwaldorf@gmail.com"},{"name":"polymer-devs","email":"admin@polymer-project.org"},{"name":"samli","email":"sam@sam.li"},{"name":"sorvell","email":"sorvell@google.com"},{"name":"usergenic","email":"brendan@usergenic.com"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"c779add85ff19143cc6865928ec1c5f780ee690f","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.0.tgz","fileCount":28,"integrity":"sha512-kJrKo5H5zd7Tlm4VpRSd8lohftznaZmT4lV17mAvautMYhZjlukQOCm1BP60anOT/IK4FmDTVeJUqZSQXcS8OQ==","signatures":[{"sig":"MEQCIAC4ZRaGs+DGKuufTRWlIsCZcKTAeIEFcw7OAcZNsihEAiB6S2SRvPbM+l4Gn11NJ8Sh5+uv/S9eUiwCTGaDDfvQeg==","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":128352,"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v3.0.4\r\nComment: 
https://openpgpjs.org\r\n\r\nwsFcBAEBCAAQBQJbmeERCRA9TVsSAnZWagAAUoMP/2pdeL7AzK0KaIywOi9o\nmgWEsErpPJ8EoQrfW0Yuv8SoDZ7H6e9SYJL+Qr/CdVk8KH1G8yMQ1Cebyx8e\nt5aWNsD78afLQ3vSrkDvpmp5912os9YVJk8RZ3l9lMF1H1R0dsUDuT0+ZBus\n9maBDbCDjlGCctEK5h8PNbZwbB9srIGYbB0VIlBq9e/mj0+thgoMMaXYuO37\n/OgZgc8UB9We4aNKDqz0ppih5zbg+CILEy6prcyZnb7+V9SvaXgNrMZFXq0/\nqQJLYXcKSDIbmbpeXoSNNs5hY2wnBTaYrGR2hXBh/QIJr14sPmhuNJgEvNq8\n7pm0hO8BpjPEjmDnFvItJkdNyWnsUia2dRRBkNodHN6uYBiar2BFQWFpKA9P\nwYHlbGKXwocosIBJi973AVve3z9TzXJwNI2gBbItmbuoMV6umYDscK+vOfyv\noJbhxZHko1+fjUFCbTETFPTNVXUcskShT6kvV26O1fkmjqozHNW2tfjAMIeW\nxPU9MBCX0JssY2BHJJOPbwJ5NyOwmZd4RQlrg8Uh1Yvy02Jtk0/vhHqJHQBV\nCmdig9fA8F0iRJVjQwl7FJ3nSKzZkddYgCVH7+/Ec6s8onAXkPY9fEu5GVqY\nP8a4EF9hkuMcKFT72T+OxfPy9YvzI64he7Mk51XmB2WG/gSuPFmN1UHkJ1yd\nCLXW\r\n=Erq1\r\n-----END PGP SIGNATURE-----\r\n"},"main":"app-media.js","gitHead":"abff8cef1c94446c55ff658590f72f05a79d9d9a","scripts":{"format":"webmat"},"_npmUser":{"name":"bicknellr","email":"bicknellr@gmail.com"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"6.4.1","description":"Elements for accessing data from media input devices","directories":{},"_nodeVersion":"10.9.0","dependencies":{"@polymer/polymer":"^3.0.0","@polymer/iron-resizable-behavior":"^3.0.0-pre.26"},"_hasShrinkwrap":false,"devDependencies":{"webmat":"^0.2.2","image-capture":"^0.3.0","webrtc-adapter":"^3.0.0","wct-browser-legacy":"^1.0.1","@polymer/test-fixture":"^4.0.1","@webcomponents/webcomponentsjs":"^2.0.0","@polymer/gen-typescript-declarations":"^1.5.1"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.0_1536811280352_0.7750007121852975","host":"s3://npm-registry-packages"}},"3.0.1":{"name":"@polymer/app-media","version":"3.0.1","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"author":{"name":"The Polymer 
Authors"},"license":"BSD-3-Clause","_id":"@polymer/app-media@3.0.1","maintainers":[{"name":"aomarks","email":"aomarks@google.com"},{"name":"azakus","email":"dfreedm2@gmail.com"},{"name":"bicknellr","email":"bicknellr@gmail.com"},{"name":"emarquez","email":"emarquez@google.com"},{"name":"justinfagnani","email":"justin@fagnani.com"},{"name":"keanulee","email":"npm@keanulee.com"},{"name":"notwaldorf","email":"notwaldorf@gmail.com"},{"name":"polymer-devs","email":"admin@polymer-project.org"},{"name":"samli","email":"sam@sam.li"},{"name":"sorvell","email":"sorvell@google.com"},{"name":"usergenic","email":"brendan@usergenic.com"}],"homepage":"https://github.com/PolymerElements/app-media","bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"dist":{"shasum":"d8dd9328e26e82ee48a928777df9c135595ddd1a","tarball":"https://registry.npmjs.org/@polymer/app-media/-/app-media-3.0.1.tgz","fileCount":29,"integrity":"sha512-G58tmppojxO9mPrm9iB+UTvewbkwbGN4AlC3qo9vimoCpbPnF8dui5ZO0YPIX0ibQXi+1wM3/EEcr4UtNlv91w==","signatures":[{"sig":"MEQCIEabUZ4/T3m0/89/7+0Yv3+yqrvoryCEp6jYXMD2SzuhAiBpx4wip2DWmGU4nhwnUzkln9P4OfVigrYk8id4oYuylQ==","keyid":"SHA256:jl3bwswu80PjjokCgh0o2w5c2U4LhQAE57gj9cz1kzA"}],"unpackedSize":128431,"npm-signature":"-----BEGIN PGP SIGNATURE-----\r\nVersion: OpenPGP.js v3.0.13\r\nComment: 
https://openpgpjs.org\r\n\r\nwsFcBAEBCAAQBQJh2WyQCRA9TVsSAnZWagAAiL8QAJYvQPVr5uq0sIP6c5SB\nc1ZKX/AOezP4hyn6ZSU7LTkY8Wz2C3tTZQPhrJoUoNff5qYLFMjVdSwUOFAX\nF4E2rwFl2I6gZwO03B8iNTa797iytLp2nKffZuUx89hKMaNwrFGGE3yZZFXn\nwbYODO36sUDt1C0YXdclYUPeXcUfVNQJHEkh6LJwQbQ6xcXvXwh1mFFrNoDY\nRIfrx9xFOo/n73FSd848hFyXJLWIpUdmWjg30bX9/L3DmnhQ878e7IsMbmLc\nwsB3I37lkGBYcqWn0O6gmm10v9woi6Cg4WZOebC+HITtU4GQexgYaNpVFUj/\nfgrXTDiVt9yCAdNBzWdUWtikd07pwKNDqCARNavecp6K9HHhxzV9anHJLz37\nMIgroz/94PJ9yb5SyFGbLx+T1Hv31lqIOqHov8dxui/2irVtTT2JCGfb/s9S\nmWri9HHP8uCKmbXT48A7gMTAHv3pPsE0BlKAEBV11YCrs4CULrfKL1y601uf\nTAxlkW4OEPUvjPh27TVdyphk3y11q8pe5a4nF6QkjOEiAyglrMq46EHT17sr\nmNBzEb0QJX2ppYal14BOzPg8yQFRk8LYiUq2LUiLkxkO8rFNqNHPgL2BKRZo\nJ2RvFehjJod2xsQ5dnQYR2ATDmFFpKzwa/8LpFlr9fm2MHFRFoRc5uieOwlK\nAGfq\r\n=0rEY\r\n-----END PGP SIGNATURE-----\r\n"},"main":"app-media.js","gitHead":"08caa78721629cc4ad0f71714c5fbf4548334a6f","scripts":{"format":"webmat"},"_npmUser":{"name":"bicknellr","email":"bicknellr@gmail.com"},"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"_npmVersion":"6.4.1","description":"Elements for accessing data from media input 
devices","directories":{},"_nodeVersion":"10.9.0","dependencies":{"@polymer/polymer":"^3.0.0","@polymer/iron-resizable-behavior":"^3.0.0-pre.26"},"_hasShrinkwrap":false,"devDependencies":{"webmat":"^0.2.2","image-capture":"^0.3.0","webrtc-adapter":"^3.0.0","wct-browser-legacy":"^1.0.1","@polymer/test-fixture":"^4.0.1","@webcomponents/webcomponentsjs":"^2.0.0","@polymer/gen-typescript-declarations":"^1.5.1"},"_npmOperationalInternal":{"tmp":"tmp/app-media_3.0.1_1536892925779_0.25562544923726294","host":"s3://npm-registry-packages"}}},"time":{"created":"2018-01-24T23:31:20.990Z","modified":"2024-10-07T21:25:47.404Z","3.0.0-pre.6":"2018-01-24T23:31:20.990Z","3.0.0-pre.7":"2018-01-30T18:48:28.291Z","3.0.0-pre.8":"2018-02-07T18:56:23.993Z","3.0.0-pre.10":"2018-02-22T23:15:12.443Z","3.0.0-pre.11":"2018-03-09T01:17:06.368Z","3.0.0-pre.12":"2018-03-22T16:56:28.722Z","3.0.0-pre.13":"2018-05-01T22:56:05.074Z","3.0.0-pre.14":"2018-05-02T00:36:43.899Z","3.0.0-pre.15":"2018-05-02T20:53:42.721Z","3.0.0-pre.16":"2018-05-04T20:17:12.804Z","3.0.0-pre.17":"2018-05-08T05:07:41.210Z","3.0.0-pre.18":"2018-05-09T10:00:02.688Z","3.0.0-pre.19":"2018-05-09T20:20:38.031Z","3.0.0-pre.20":"2018-06-23T00:17:48.369Z","3.0.0-pre.21":"2018-06-23T01:05:52.693Z","3.0.0-pre.22":"2018-08-25T03:23:01.952Z","3.0.0-pre.23":"2018-08-25T07:15:30.295Z","3.0.0-pre.24":"2018-08-29T02:48:59.998Z","3.0.0-pre.25":"2018-08-29T04:02:14.621Z","3.0.0-pre.26":"2018-09-12T01:14:24.968Z","3.0.0":"2018-09-13T04:01:20.509Z","3.0.1":"2018-09-14T02:42:05.999Z"},"bugs":{"url":"https://github.com/PolymerElements/app-media/issues"},"author":{"name":"The Polymer Authors"},"license":"BSD-3-Clause","homepage":"https://github.com/PolymerElements/app-media","keywords":["web-components","web-component","polymer","app","user","media","stream","camera","audio","video"],"repository":{"url":"git://github.com/PolymerElements/app-media.git","type":"git"},"description":"Elements for accessing data from media input 
devices","maintainers":[{"email":"rictic@gmail.com","name":"rictic"},{"email":"aomarks@gmail.com","name":"aomarks"},{"email":"emarquez@google.com","name":"emarquez"},{"email":"sorvell@google.com","name":"sorvell"},{"email":"bicknellr@gmail.com","name":"bicknellr"},{"email":"brendan@usergenic.com","name":"usergenic"},{"email":"admin@polymer-project.org","name":"polymer-devs"},{"email":"dfreedm2@gmail.com","name":"azakus"},{"email":"kevinpschaaf@gmail.com","name":"kevinpschaaf"},{"email":"justin@fagnani.com","name":"justinfagnani"}],"readme":"[![Published on NPM](https://img.shields.io/npm/v/@polymer/app-media.svg)](https://www.npmjs.com/package/@polymer/app-media)\n[![Build status](https://travis-ci.org/PolymerElements/app-media.svg?branch=master)](https://travis-ci.org/PolymerElements/app-media)\n[![Published on webcomponents.org](https://img.shields.io/badge/webcomponents.org-published-blue.svg)](https://webcomponents.org/element/@polymer/app-media)\n\n## App Media Elements\n\nElements for accessing data from media input devices, such as cameras and\nmicrophones, and visualizing that data for users.\n\n_See [the full explainer](./explainer.md) for detailed usage._\n\nSee: [Documentation](https://www.webcomponents.org/element/@polymer/app-media),\n  [Demo](https://www.webcomponents.org/element/@polymer/app-media/demo/demo/index.html).\n\n### Browser support\n\nThe following emerging platform APIs are used by this collection of elements:\n\n - [Media Capture and Streams](https://www.w3.org/TR/mediacapture-streams/)\n - [MediaStream Recording](https://www.w3.org/TR/mediastream-recording/)\n - [Web Audio API](https://www.w3.org/TR/webaudio/)\n - [MediaStream Image Capture](https://w3c.github.io/mediacapture-image/)\n\nSome additional browser support is enabled by\n[WebRTC polyfill](https://github.com/webrtc/adapter) and\n[MediaStream ImageCapture API polyfill](https://github.com/GoogleChromeLabs/imagecapture-polyfill).\nThe following table documents browser support 
for the elements in this collection with\nthese polyfills in use:\n\n - ✅ Stable native implementation\n - 🚧 Partial fidelity with polyfill\n - 🚫 Not supported at all\n\nElement                   | Chrome | Safari 11 | Firefox | Edge  | IE 11\n--------------------------|--------|-----------|---------|-------|------\n`app-media-video`         |     ✅ |        ✅ |      ✅ |    ✅ |    ✅\n`app-media-audio`         |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-waveform`      |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-devices`       |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-stream`        |     ✅ |        ✅ |      ✅ |    ✅ |    🚫\n`app-media-recorder`      |     ✅ |        🚫 |      ✅ |    🚫 |    🚫\n`app-media-image-capture` |     ✅ |        🚧 |      🚧 |    🚧 |    🚫\n\n## Usage\n\n### Installation\n\nElement:\n```\nnpm install --save @polymer/app-media\n```\n\nPolyfills:\n```\nnpm install --save webrtc-adapter\nnpm install --save image-capture\n```\n\n### In an HTML file\n\n#### `<app-media-devices>`\n\n```html\n<html>\n  <head>\n    <script type=\"module\">\n      import '@polymer/app-media/app-media-devices.js';\n    </script>\n  </head>\n  <body>\n    <app-media-devices\n        kind=\"audioinput\"\n        devices=\"{{microphones}}\">\n    </app-media-devices>\n  </body>\n</html>\n```\n\n#### `<app-media-stream>`\n\n```html\n<html>\n  <head>\n    <script type=\"module\">\n      import '@polymer/polymer/lib/elements/dom-bind.js';\n      import '@polymer/app-media/app-media-devices.js';\n      import '@polymer/app-media/app-media-stream.js';\n    </script>\n  </head>\n  <body>\n    <dom-bind>\n      <template>\n        <app-media-devices\n            kind=\"audioinput\"\n            devices=\"{{microphone}}\">\n        </app-media-devices>\n        <app-media-stream\n            audio-device=\"[[microphone]]\"\n            stream=\"{{microphoneStream}}\">\n        </app-media-stream>\n      </template>\n    </dom-bind>\n  
</body>\n</html>\n```\n\n#### `<app-media-video>`\n\n```html\n<html>\n  <head>\n    <script type=\"module\">\n      import '@polymer/polymer/lib/elements/dom-bind.js';\n      import '@polymer/app-media/app-media-devices.js';\n      import '@polymer/app-media/app-media-stream.js';\n      import '@polymer/app-media/app-media-video.js';\n    </script>\n  </head>\n  <body>\n    <dom-bind>\n      <template>\n        <app-media-devices\n            kind=\"videoinput\"\n            devices=\"{{camera}}\">\n        </app-media-devices>\n        <app-media-stream\n            video-device=\"[[camera]]\"\n            stream=\"{{cameraStream}}\">\n        </app-media-stream>\n        <app-media-video\n            source=\"[[cameraStream]]\"\n            autoplay>\n        </app-media-video>\n      </template>\n    </dom-bind>\n  </body>\n</html>\n```\n\n#### `<app-media-recorder>`\n\n```html\n<html>\n  <head>\n    <script type=\"module\">\n      import '@polymer/polymer/lib/elements/dom-bind.js';\n      import '@polymer/app-media/app-media-devices.js';\n      import '@polymer/app-media/app-media-stream.js';\n      import '@polymer/app-media/app-media-recorder.js';\n    </script>\n  </head>\n  <body>\n    <dom-bind>\n      <template>\n        <app-media-devices\n            kind=\"videoinput\"\n            devices=\"{{camera}}\">\n        </app-media-devices>\n        <app-media-devices\n            kind=\"audioinput\"\n            devices=\"{{microphone}}\">\n        </app-media-devices>\n        <app-media-stream\n            video-device=\"[[camera]]\"\n            audio-device=\"[[microphone]]\"\n            stream=\"{{cameraAndMicrophoneStream}}\">\n        </app-media-stream>\n        <app-media-recorder\n            id=\"recorder\"\n            stream=\"[[cameraAndMicrophoneStream]]\"\n            data=\"{{recordedVideo}}\"\n            duration=\"3000\">\n        </app-media-recorder>\n      </template>\n    </dom-bind>\n    <script>\n      function createRecording() 
{\n        recorder.start();\n      }\n    </script>\n  </body>\n</html>\n```\n\n##### `<app-media-image-capture>`\n\n```html\n<html>\n  <head>\n    <script type=\"module\">\n      import '@polymer/polymer/lib/elements/dom-bind.js';\n      import '@polymer/app-media/app-media-devices.js';\n      import '@polymer/app-media/app-media-stream.js';\n      import '@polymer/app-media/app-media-image-capture.js';\n    </script>\n  </head>\n  <body>\n    <dom-bind>\n      <template>\n        <app-media-devices\n            kind=\"videoinput\"\n            devices=\"{{camera}}\">\n        </app-media-devices>\n        <app-media-stream\n            video-device=\"[[camera]]\"\n            stream=\"{{videoStream}}\">\n        </app-media-stream>\n        <app-media-image-capture\n            id=\"imageCapture\"\n            stream=\"[[videoStream]]\"\n            focus-mode=\"single-shot\"\n            red-eye-reduction\n            last-photo=\"{{photo}}\">\n        </app-media-image-capture>\n      </template>\n    </dom-bind>\n    <script>\n      function takePhoto() {\n        imageCapture.takePhoto();\n      }\n    </script>\n  </body>\n</html>\n```\n\n#### `<app-media-audio>`\n\n```html\n<html>\n  <head>\n    <script type=\"module\">\n      import '@polymer/polymer/lib/elements/dom-bind.js';\n      import '@polymer/app-media/app-media-devices.js';\n      import '@polymer/app-media/app-media-stream.js';\n      import '@polymer/app-media/app-media-audio.js';\n    </script>\n  </head>\n  <body>\n    <dom-bind>\n      <template>\n        <app-media-devices\n            kind=\"audioinput\"\n            devices=\"{{microphone}}\">\n        </app-media-devices>\n        <app-media-stream\n            audio-device=\"[[microphone]]\"\n            stream=\"{{microphoneStream}}\">\n        </app-media-stream>\n        <app-media-audio\n            stream=\"[[microphoneStream]]\"\n            analyser=\"{{microphoneAnalyzer}}\">\n        </app-media-audio>\n      </template>\n    </dom-bind>\n  
</body>\n</html>\n```\n\n#### `<app-media-waveform>`\n\n```html\n<html>\n  <head>\n    <script type=\"module\">\n      import '@polymer/polymer/lib/elements/dom-bind.js';\n      import '@polymer/app-media/app-media-devices.js';\n      import '@polymer/app-media/app-media-stream.js';\n      import '@polymer/app-media/app-media-audio.js';\n      import '@polymer/app-media/app-media-waveform.js';\n    </script>\n  </head>\n  <body>\n    <dom-bind>\n      <template>\n        <app-media-devices\n            kind=\"audioinput\"\n            devices=\"{{microphone}}\">\n        </app-media-devices>\n        <app-media-stream\n            audio-device=\"[[microphone]]\"\n            stream=\"{{microphoneStream}}\">\n        </app-media-stream>\n        <app-media-audio\n            stream=\"[[microphoneStream]]\"\n            analyser=\"{{microphoneAnalyzer}}\">\n        </app-media-audio>\n        <app-media-waveform analyser=\"[[microphoneAnalyzer]]\">\n        </app-media-waveform>\n      </template>\n    </dom-bind>\n  </body>\n</html>\n```\n\n### In a Polymer 3 element\n\n##### `<app-media-devices>`\n\n```js\nimport {PolymerElement, html} from '@polymer/polymer';\nimport '@polymer/app-media/app-media-devices.js';\n\nclass SampleElement extends PolymerElement {\n  static get template() {\n    return html`\n      <app-media-devices\n          kind=\"audioinput\"\n          devices=\"{{microphones}}\">\n      </app-media-devices>\n    `;\n  }\n}\ncustomElements.define('sample-element', SampleElement);\n```\n\n##### `<app-media-stream>`\n\n```js\nimport {PolymerElement, html} from '@polymer/polymer';\nimport '@polymer/app-media/app-media-devices.js';\nimport '@polymer/app-media/app-media-stream.js';\n\nclass SampleElement extends PolymerElement {\n  static get template() {\n    return html`\n      <app-media-devices\n          kind=\"audioinput\"\n          devices=\"{{microphone}}\">\n      </app-media-devices>\n      <app-media-stream\n          audio-device=\"[[microphone]]\"\n     
     stream=\"{{microphoneStream}}\">\n      </app-media-stream>\n    `;\n  }\n}\ncustomElements.define('sample-element', SampleElement);\n```\n\n#### `<app-media-video>`\n\n```js\nimport {PolymerElement, html} from '@polymer/polymer';\nimport '@polymer/app-media/app-media-devices.js';\nimport '@polymer/app-media/app-media-stream.js';\nimport '@polymer/app-media/app-media-video.js';\n\nclass SampleElement extends PolymerElement {\n  static get template() {\n    return html`\n      <app-media-devices\n          kind=\"videoinput\"\n          devices=\"{{camera}}\">\n      </app-media-devices>\n      <app-media-stream\n          video-device=\"[[camera]]\"\n          stream=\"{{cameraStream}}\">\n      </app-media-stream>\n      <app-media-video\n          source=\"[[cameraStream]]\"\n          autoplay>\n      </app-media-video>\n    `;\n  }\n}\ncustomElements.define('sample-element', SampleElement);\n```\n\n#### `<app-media-recorder>`\n\n```js\nimport {PolymerElement, html} from '@polymer/polymer';\nimport '@polymer/app-media/app-media-devices.js';\nimport '@polymer/app-media/app-media-stream.js';\nimport '@polymer/app-media/app-media-recorder.js';\n\nclass SampleElement extends PolymerElement {\n  static get template() {\n    return html`\n      <app-media-devices\n          kind=\"videoinput\"\n          devices=\"{{camera}}\">\n      </app-media-devices>\n      <app-media-devices\n          kind=\"audioinput\"\n          devices=\"{{microphone}}\">\n      </app-media-devices>\n      <app-media-stream\n          video-device=\"[[camera]]\"\n          audio-device=\"[[microphone]]\"\n          stream=\"{{cameraAndMicrophoneStream}}\">\n      </app-media-stream>\n      <app-media-recorder\n          id=\"recorder\"\n          stream=\"[[cameraAndMicrophoneStream]]\"\n          data=\"{{recordedVideo}}\"\n          duration=\"3000\">\n      </app-media-recorder>\n    `;\n  }\n\n  createRecording() {\n    this.$.recorder.start();\n  
}\n}\ncustomElements.define('sample-element', SampleElement);\n```\n\n##### `<app-media-image-capture>`\n\n```js\nimport {PolymerElement, html} from '@polymer/polymer';\nimport '@polymer/app-media/app-media-devices.js';\nimport '@polymer/app-media/app-media-stream.js';\nimport '@polymer/app-media/app-media-image-capture.js';\n\nclass SampleElement extends PolymerElement {\n  static get template() {\n    return html`\n      <app-media-devices\n          kind=\"videoinput\"\n          devices=\"{{camera}}\">\n      </app-media-devices>\n      <app-media-stream\n          video-device=\"[[camera]]\"\n          stream=\"{{videoStream}}\">\n      </app-media-stream>\n      <app-media-image-capture\n          id=\"imageCapture\"\n          stream=\"[[videoStream]]\"\n          focus-mode=\"single-shot\"\n          red-eye-reduction\n          last-photo=\"{{photo}}\">\n      </app-media-image-capture>\n    `;\n  }\n\n  takePhoto() {\n    this.$.imageCapture.takePhoto();\n  }\n}\ncustomElements.define('sample-element', SampleElement);\n```\n\n#### `<app-media-audio>`\n\n```js\nimport {PolymerElement, html} from '@polymer/polymer';\nimport '@polymer/app-media/app-media-devices.js';\nimport '@polymer/app-media/app-media-stream.js';\nimport '@polymer/app-media/app-media-audio.js';\n\nclass SampleElement extends PolymerElement {\n  static get template() {\n    return html`\n      <app-media-devices\n          kind=\"audioinput\"\n          devices=\"{{microphone}}\">\n      </app-media-devices>\n      <app-media-stream\n          audio-device=\"[[microphone]]\"\n          stream=\"{{microphoneStream}}\">\n      </app-media-stream>\n      <app-media-audio\n          stream=\"[[microphoneStream]]\"\n          analyser=\"{{microphoneAnalyzer}}\">\n      </app-media-audio>\n    `;\n  }\n}\ncustomElements.define('sample-element', SampleElement);\n```\n\n#### `<app-media-waveform>`\n\n```js\nimport {PolymerElement, html} from '@polymer/polymer';\nimport 
'@polymer/app-media/app-media-devices.js';\nimport '@polymer/app-media/app-media-stream.js';\nimport '@polymer/app-media/app-media-audio.js';\nimport '@polymer/app-media/app-media-waveform.js';\n\nclass SampleElement extends PolymerElement {\n  static get template() {\n    return html`\n      <app-media-devices\n          kind=\"audioinput\"\n          devices=\"{{microphone}}\">\n      </app-media-devices>\n      <app-media-stream\n          audio-device=\"[[microphone]]\"\n          stream=\"{{microphoneStream}}\">\n      </app-media-stream>\n      <app-media-audio\n          stream=\"[[microphoneStream]]\"\n          analyser=\"{{microphoneAnalyzer}}\">\n      </app-media-audio>\n      <app-media-waveform analyser=\"[[microphoneAnalyzer]]\">\n      </app-media-waveform>\n    `;\n  }\n}\ncustomElements.define('sample-element', SampleElement);\n```\n\n## Contributing\nIf you want to send a PR to this element, here are\nthe instructions for running the tests and demo locally:\n\n### Installation\n```sh\ngit clone https://github.com/PolymerElements/app-media\ncd app-media\nnpm install\nnpm install -g polymer-cli\n```\n\n### Running the demo locally\n```sh\npolymer serve --npm\nopen http://127.0.0.1:<port>/demo/\n```\n\n### Running the tests\n```sh\npolymer test --npm\n```","readmeFilename":"README.md"}