Watch Adobe’s Project Wetbrush Oil-Paint Simulator in Action

Ever wish your digital painting could really have the look of oil on canvas? Well, the next generation of Adobe tools — with a hefty assist from powerful GPUs from companies like Nvidia — might get your digital work a little closer to the real thing.

At a SIGGRAPH press briefing yesterday, Nvidia showed off a demo of "Project Wetbrush," an Adobe technology that simulates oil painting in the digital realm. It's not just a glorified Photoshop texture tool either — Wetbrush actually computes the effects of the collision between the brush tip, the glob of paint, and any paint that has already been applied to the canvas.

 


The results are remarkable, but hard to describe. Pardon the mediocre cell-phone video, but we grabbed this glimpse of Wetbrush in action to give you a better idea of what the technology can do.

The upshot, Nvidia reps said, was that masterworks created in Wetbrush could ultimately be 3D-printed in a process that preserves not only the color values of the oil painting but also the depth information that gives it its texture.

Of course, Project Wetbrush is likely pretty far from being incorporated into an actual Adobe product. Think of it as yet another Cool Thing you first heard about at SIGGRAPH that may, or may not, eventually make it to market.

For more on Wetbrush, see Nvidia's blog or look for a demo at Nvidia's SIGGRAPH booth (#509).


Eight Shared Storage Tips from Citizen Pictures, StorExcel and Quantum


Denver's Citizen Pictures recently became one of the first facilities to install a new Quantum Xcellis storage system, replacing an aging Xsan setup. During a StudioDaily webinar yesterday sponsored by Quantum, Andrew Moraski, post-production supervisor at Citizen; Lance Hukill, president of systems integrator StorExcel; and Janet Lafleur, StorNext product marketing manager for Quantum, discussed Citizen's move as well as general operating principles for effective shared storage infrastructure. You can still view the entire webinar on demand, but we've summarized some key takeaways below.

1. Old Xsan systems can often be upgraded in place, with extensions as needed.
"We added a little bit of disk to expand out. If we're doing a proxy workflow for a show that has six angles, or a normal show with two or three cameras where we're editing full-res, we just added that extra storage to give us that flexibility." — Andrew Moraski

2. Everyone knows Ultra HD is four times the size of HD — but don't forget about HDR. "HDR adds to the load over and above whatever you're doing for 4K. You go from 8 bit to 10 bit, so that's 25% more space you require." — Janet Lafleur
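To put rough numbers on that quote, here's a back-of-the-envelope sketch of uncompressed frame sizes. The resolutions and the assumption of three full samples per pixel (4:4:4) are illustrative, not figures from the webinar — real codecs and chroma subsampling change the absolute sizes, but the ratios hold:

```python
def frame_bytes(width, height, bits_per_sample, samples_per_pixel=3):
    """Uncompressed size of one video frame in bytes (4:4:4 assumed)."""
    return width * height * samples_per_pixel * bits_per_sample / 8

hd_8bit   = frame_bytes(1920, 1080, 8)   # HD frame at 8-bit
uhd_8bit  = frame_bytes(3840, 2160, 8)   # Ultra HD frame at 8-bit
uhd_10bit = frame_bytes(3840, 2160, 10)  # Ultra HD frame at 10-bit (HDR)

print(uhd_8bit / hd_8bit)     # 4.0  — Ultra HD is four times HD
print(uhd_10bit / uhd_8bit)   # 1.25 — 10-bit adds 25% over 8-bit
```

In other words, an HDR Ultra HD workflow needs roughly five times the storage of the equivalent 8-bit HD workflow, before compression is taken into account.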

3. Hybrid SAN/NAS workflow has benefits. "Some of the processes, like ingest, might need a SAN-based type of platform. When you're doing proxy-based editing, a NAS-based platform is sufficient. Having a Quantum Xcellis-type of converged architecture allows you the extensibility to do both approaches." — Lance Hukill

4. Security measures should include Open Directory and LDAP at the least. "I'm still surprised that a lot of customers aren't using Open Directory or Active Directory within the production environment. That needs to become more general hygiene or best practice — that you should use real authentication techniques just to have your own internal users know who should have access to what files. Whether it's Open Directory or LDAP, that kind of authentication schema is critical." — LH

5. It's never too early to think about asset management, even if you haven't yet implemented a media asset management (MAM) solution. "If you're not doing asset management, hopefully you have a very structured project folder nomenclature, where even the project names and file names have, somewhat, embedded metadata. If and when you do go to an asset-managed platform, that transition — to map the content and metadata fields and project structures — can more easily go into the platform you end up choosing." — LH
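One way to picture the "embedded metadata" idea: a strict naming convention can be parsed programmatically later, when you migrate to a MAM. The field layout and example filename below are entirely hypothetical (not Citizen's actual scheme), just a sketch of how structured names map to metadata fields:

```python
import re

# Hypothetical convention: SHOW_EPISODE_SCENE_CAMERA_VERSION.ext
# A MAM migration script could map these captured groups to metadata columns.
PATTERN = re.compile(
    r"(?P<show>[A-Z0-9]+)_"
    r"(?P<episode>EP\d+)_"
    r"(?P<scene>SC\d+)_"
    r"(?P<camera>CAM[A-Z])_"
    r"(?P<version>v\d+)\.(?P<ext>\w+)$"
)

def parse_clip_name(filename):
    """Return the metadata embedded in a clip name, or None if it doesn't conform."""
    m = PATTERN.match(filename)
    return m.groupdict() if m else None

print(parse_clip_name("COOKOFF_EP102_SC04_CAMA_v003.mov"))
# → {'show': 'COOKOFF', 'episode': 'EP102', 'scene': 'SC04',
#    'camera': 'CAMA', 'version': 'v003', 'ext': 'mov'}
```

A name that breaks the convention returns None, which is exactly the kind of content that becomes orphaned during an asset-management migration.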

6. You can have easy access to your archive even without using a MAM. "A smart file system can offer transparent access to the archive, like StorNext does with its file system. Even if you're not using a MAM you can still access a file that's on tape or in object storage in the cloud exactly as you would if it were on online storage. The only difference is there's some latency involved with tape, but it's still seamless." — JL

7. You may not need back-up software or a long-term archive, but you do need to protect raw footage. "Our product has the ability to immediately, as soon as new footage comes aboard, archive it — basically, make a copy somewhere else, so that it's secure. It's fixed content. It's never going to change, so you don't need to keep checking it with any kind of backup software. So a solution that really looks at how to not just preserve the content long term but protect it the moment it enters your workflow is something to look for." — JL

8. Regularly sending files to off-site storage is a good safeguard for disaster recovery. "When shooting in the field … we transfer to two sets of drives, and when they come in we send one to off-site storage and then the other one gets put onto the SAN and backed up nightly, as well. We have multiple copies of content everywhere, so if anything was to happen, whether it's a building issue or a deletion issue, we have copies elsewhere and we can continue to crank out content." — AM

Watch this webinar on demand.


Five Things Pokémon Go Shows Us About the Future of Media


If you work in a big city, you need only step outside during your lunch break to see the effects of the first true AR craze. Odds are, you'll see someone wandering around, eyes glued to phone, looking to add a new Pokémon to their collection or to rustle up some Pokéballs at a nearby Pokéstop. Launched just last week for iOS and Android phones, the latest installment of the Nintendo-originated videogame franchise has become a cultural sensation, sending gamers off their couches and into the streets to travel through a fantasy universe layered over the top of a Google Maps-like representation of the real world. By some accounts, Pokémon Go is on track to become more widely used than Twitter in less than a week's time on the market. Here's what the wild success of Pokémon Go may portend for the media market.

1. Augmented reality is potentially a big business. A very big business. AR's bigger brother, virtual reality, gets most of the press thanks to its more immersive nature. But some observers expect the action to be in AR.  Market research consultant Digi-Capital, for instance, is projecting that by 2020, the combined AR/VR market will reach $120 billion, with AR making up about $90 billion of the total. Goldman Sachs is more bearish, predicting a combined AR/VR market of anywhere from $15 billion on the low end to $110 billion on the high end.

2. Runaway success may be the tricky part.  If your AR application has a networked component, you need to make sure demand won't outstrip your service capabilities and melt your servers. Pokémon Go players have been reporting intermittent service outages, and developer Niantic has already had to slow the app's international expansion due to server load issues. 

3. Privacy issues are no joke. Pokémon Go has already had its first scandal. The application apparently dramatically overstepped its bounds by requesting full access to users' Google accounts. Eager Pokémon hunters were only too happy to comply, meaning the game theoretically had read access to all of their Google account data, including email. Ars Technica called it a "possible privacy trainwreck," but Wired reported earlier today that the app has been updated to fix the issue. So be careful if you're collecting data about users' activities in the real world, and their real identities, lest you turn an army of privacy advocates against you right out of the gate.

4. Familiar franchises have a lot of leverage in a brave new media world.  It's no surprise that the first killer AR app is based on a beloved franchise, just as it makes sense that the most buzzed-about application for the forthcoming PlayStation VR is the Star Wars Battlefront VR edition. "Battlefront is going to be one of those games that will really show gamers what it means to be in the world of VR," as Sony Interactive Entertainment VP of marketing John Koller told Fortune. It's safe to say that many early consumer AR and VR sensations will benefit from built-in nostalgia and/or nerd appeal.

5. AR hardware is already in your pocket. Fancy AR hardware like Microsoft's Hololens or Meta's latest development kit or whatever it is that Magic Leap is getting ready to unleash on the world is still a ways down the road. But smartphones that can run lightweight, 2D AR experiences like Pokémon Go are tiny and ubiquitous. That's one reason AR has a head start on VR, which requires a pricey headset and a hefty desktop computing rig to come close to being truly immersive.


Five Things to Know About the VFX in Independence Day: Resurgence

It took 20 years, but the Independence Day franchise returned last Friday to kick some landmark-destroying alien butt and dazzle audiences with a stellar variety of large-scale explosive visual effects. The new film, like the 1996 original, has Volker Engel as VFX supervisor and features effects built and blended by some of the best facilities in the business.

1. The sequel has a monster load of effects shots—but not the most ever.

The original Independence Day won the visual effects Oscar with 430 effects shots, which at the time were among the most sophisticated on screen. The sequel includes 1,750 effects shots by Uncharted Territory (run by Engel and Marc Weigert), Weta, MPC, Scanline, Image Engine, Cinesite, Luxx Studios and Digital Domain. For comparison, that's more than 2013's Man of Steel yet just under the 2,100 shots ILM completed, with partner studios Virtuos, Hybride and Base FX, in Star Wars: The Force Awakens. But it's still only a bit more than half the VFX shots in Captain America: Civil War and Avengers: Age of Ultron. While studios may define digital effects shots differently and sheer numbers don't necessarily equal the most complex, no film has yet come close to the 4,500+ shots reportedly created for the 2015 epic Bollywood film Baahubali: The Beginning.

Emmerich on set

2. The secret to a more fluid VFX workflow? A barely-there VR camera tracker.

According to fxguide's exhaustive examination of the film's effects, the virtual production tracker Ncam was used for some 90 percent of the VFX shoot. Ncam uses a lightweight sensor that can be attached to any camera for an unfettered, continuous stream of super-precise positional and rotational data that flows directly into whatever VFX software and graphics engine is being used in post. Camera operators use the Ncam data when framing virtual sets, and VFX supes rely on it to build final shots. Director Roland Emmerich thinks "this kind of system is the future," and although he found it took some patience to learn to use effectively on set, it is now his favorite tool.

3. Uncharted Territory did the most shots, but Scanline and Weta handled some of the most complex.

Engel and his team at Uncharted Territory served as the hub for all the effects in the movie and worked closely with Emmerich from start to finish. But Scanline, known for Flowline, its Sci-Tech Oscar-winning proprietary fluid-effects software that makes quick work of water-based VFX while making them breathtakingly real, had the gargantuan task of figuring out how to visualize a 3,000-mile-wide alien spacecraft as it descended through Earth's atmosphere (at top). Weta pulled out all the stops for the final bus-in-the-dust vs. galloping alien chase sequence.


4. A green beam for the new era.

The iconic green beam that destroyed the White House in the original film is back with a vengeance. We first see it in the opening shots set on the moon. VFX supervisor Sue Rowe and her team at MPC worked alongside Engel to bring geological fidelity to every simulation of the lunar landscape using concept art, NASA images taken through the Hubble Telescope and close study of the moon's surface. It's one reason the eventual destruction of the organic surfaces looks so real. When the alien Mother Ship strikes back, the familiar laser—this time created with RealFlow and Houdini—takes it all down in a flash. MPC also created full CG digital doubles of the actors in their space suits, built from detailed scans and photographs, for the final moments of destruction.

5. This is the fourth film for Roland Emmerich that VFX supe Volker Engel has also produced.

Engel was co-producer and VFX supervisor (with Marc Weigert) for Emmerich's White House Down (2013) and 2012 (2009), and was executive producer and VFX supervisor for the director's 2011 film Anonymous. He co-produced Independence Day: Resurgence. The additional credits were apparently brokered by Emmerich with the studios as a way to give credit where it was long overdue.

It’s Adobe Day: The New Creative Cloud Applications Are Out

Adobe today launched new versions of applications in Creative Cloud, including Premiere Pro, After Effects, Media Encoder, Audition and Photoshop.

As we previously reported, chief among the new features are new capabilities in Premiere's Lumetri Color Panel (including HSL secondaries), performance enhancements in After Effects, and improvements to the ever-intriguing Character Animator. (The latter is still officially a "preview" release, which seems odd for software that's already been used to generate several minutes of live prime-time programming for a major television network.) Audition now includes some options giving editors access to useful combinations of filters and effects without requiring them to tune the stacks from scratch, and Media Encoder comes into play as part of an end-to-end refinement to queueing, rendering and proxy workflow.

One thing to be careful of — Adobe says that projects saved in v2015.3 of its applications cannot be opened in earlier versions. So if you think you may need to open current projects on systems running earlier versions of Creative Cloud apps for some reason, this might be an update you want to skip until you're sure all of the workstations you use are on the same page.

The downloads are available through your desktop Creative Cloud app. Here's a video with Adobe's Dave Helmly introducing the new features in Premiere Pro, followed by Adobe's official what's-new indexes for both Premiere Pro and After Effects. 


Adobe Premiere Pro CC

Performance optimizations

  • Apple Metal GPU supported by Mercury Playback Engine (Initial support – some effects not currently supported)
  • Native decode (i.e. no QuickTime installation required) on Windows of Apple ProRes
  • Improvements to MorphCut face detection and tracking
  • H.264 playback GPU acceleration (Windows platform with Intel Iris chipsets only)

Enhancements to the editing experience

  • Import, edit, and create Open Captions (subtitles), including options for font, size, position, color, weight, background color and background opacity
  • In-application licensing of Adobe Stock assets from Project Panel or Timeline
  • New badge in Project Panel for unlicensed stock assets
  • Support for Arabic and Hebrew languages in the Titler
  • Timeline auto scroll when range selecting
  • See more tracks at once with smaller minimum timeline height
  • Remove Attributes to remove individual effects from clips
  • Newly assignable keyboard shortcuts:
    • Add/remove a keyframe in the Effect Control Panel
    • Nudge keyframe left or right by one frame
    • Select next/previous keyframe
    • Increase/decrease keyframe value
    • Constrain Direct Manipulation horizontally or vertically while dragging
    • Toggle timeline zoom to single frame level
  • Twirl state of intrinsic effects shared between all clips in Effect Control Panel is remembered
  • Twirl state of parameters in Export Settings dialog is remembered
  • Show all clip markers in a sequence in the Markers panel
  • Filter markers in the Marker panel by color
  • Frame count offset updates dynamically while trimming with the mouse
  • FCP XML time remapping (speed ramps) support
  • Adjust multiple clips’ field options simultaneously
  • Create dedicated folders for each Scratch Disk file Premiere generates
  • Multiple fixes to known issues with voice-over recording using Mercury Transmit

New and improved format support

  • AS-10 export
  • Direct export to XDCAM HD disc
  • HEVC 10-bit export
  • Improvements to J2K export (24p and 30p now supported)
  • Panasonic AVC-LongG export
  • RED Weapon 8K, RED Raven support
  • QT XDCAM HD to MXF XDCAM HD Smart Rendering
  • ‘Sony device compatibility’ checkbox added to XAVC export settings
  • Match Source controls for still image formats
  • Create separate mono channels for DNxHD exports

Source: blogs.adobe.com


Adobe After Effects CC

  • Enhanced video and audio playback: After Effects CC 2015 (13.8) uses a new playback architecture to deliver real-time playback of cached frames with synced audio. The new architecture is shared with other Adobe applications, like Premiere Pro and Audition.
  • Effect rendering on the GPU: The Lumetri Color, Gaussian Blur, and Sharpen effects can now render using your computer’s GPU. This improves rendering performance for these effects by 2x-4x over rendering using only the CPU (depending on the frame being rendered and the speed of your GPU). GPU effect rendering is controlled via the new Video Rendering and Effects option in the Project Settings dialog.
  • Performance improvements: Many small changes under the hood include faster import and caching of image sequences, asynchronous drawing of viewer panels, faster opening of large projects, improved expression caching, and more.
  • Additional native format support: Apple ProRes QuickTime files can be decoded on Windows without needing QuickTime installed on the system. RED camera raw file decoding now supports RED Scarlet-W, Raven, and Weapon cameras, including 8K .r3d footage.
  • Lumetri Color effect improvements: The Lumetri Color effect can now render using your computer’s GPU, and includes new HSL Secondary controls and new SpeedLooks presets.
  • Gaussian Blur effect improvements: The Gaussian Blur effect has been updated to a new version. The Repeat Edge Pixels option from the Fast Blur effect has been added, and the effect can now render using your computer’s GPU. This version of Gaussian Blur replaces both the previous Gaussian Blur (Legacy) effect and the Fast Blur effect, which are still available but have been moved to the Obsolete category.
  • Add compositions to Adobe Media Encoder with render settings: You can now send compositions from the Render Queue to the Adobe Media Encoder queue with the options you choose in the Render Settings dialog. When you click the new Queue in AME button in the Render Queue panel, queued render items are added to Adobe Media Encoder. Compositions will be rendered by Adobe Media Encoder with the render settings you chose in After Effects.
  • Read-only collaboration with Creative Cloud libraries: Assets in Creative Cloud libraries can be set to read-only, so they can be shared but not changed or deleted.
  • Libraries panel improvements: The new Libraries workspace makes it easier to search and add assets to your project. Find the right content fast using filtered search for Adobe Stock, display of length and format for Stock video, and links to video previews.
  • Scroll panel tabs using the Hand tool: When a panel group has more tabs than you can see at once, you can now scroll the tabs by panning with the Hand tool. For example, hold the spacebar key to activate the Hand tool, then click in the tab well and drag left or right, the same way you would to pan in the Composition panel.
  • Maxon CINEWARE 3.1: The latest version of Maxon’s CINEWARE plug-in includes bug fixes, enhanced OpenGL rendering, and scene coordinate matching.
  • Maxon Cinema 4D Exporter: The Maxon Cinema 4D Exporter now exports animated 3D text and shape layers into the .c4d file. 3D text layers can be exported as extruded spline objects that retain animation fidelity, or as extruded text objects that preserve the ability to edit the text in Cinema 4D.
  • And even more: Many additional small improvements, such as a Swap Colors button in the Tint effect, refinements to project auto-save, and many bug fixes.

Source: blogs.adobe.com

More details on all of the application updates are available at the Creative Cloud blog.

