Before you get started, read the overview of our cutting machines to make sure you know which machine best suits your project and what you need to do before using it.
To get started with the Cutter/Plotter, read or review the information below, then walk in during our open hours in Rose Library.
If you haven’t used Adobe Illustrator before, expect to spend at least an hour learning how to complete some basic tasks with it before working with the cutter/plotter. As always, assistance and materials are provided by Makery staff. Additionally, we regularly hold workshops on how to use Adobe Illustrator.
The Graphtec cutting plotter is directly connected to one of the workstation computers in the Rose Library makerspace. There is a dedicated plugin within Adobe Illustrator used to send designs to the machine. You have your choice of any of the colors or varieties of vinyl we have available.
To cut a design onto one of these materials, follow the steps below:
Once the image has been traced and expanded into paths, it is ready to cut using the Cutting Master plugin.
Make sure the main Selection tool is active, then select everything you want to cut.
TIP: If you have multiple images traced in the workspace, highlight only what you would like to cut out first. Use the Direct Selection tool to select smaller portions of an image. Selecting each piece separately lets you adjust each one individually within Cutting Master.
Once you have your image selected, go to File > Cutting Master 5 > Send to Cutting Master 5.
A new window will pop up. This is Cutting Master 5, the plug-in that connects Illustrator to the cutter/plotter. Your design should appear in the new window! Double-check that it loaded into the software correctly and looks the way you want it to turn out. Under the first tab you will find options to resize, rotate, and mirror.
Tip: To keep your design's proportions consistent when resizing, make sure the "Proportional" box is checked.
You can also choose how many copies to cut in the fifth tab.
If you are using heat transfer vinyl to iron onto fabric, the most important thing to remember is to flip the image so that it is mirrored. This ensures the image faces the right way when you iron it on. You can do this by clicking the dropdown box next to "Mirror" and selecting the mirror plane you prefer.
When you are ready and the machine is loaded, hit "Output" and the machine will begin cutting. Be sure to monitor the cutting process to ensure your design is being executed correctly.
Here are a few descriptions of what each configuration means if you are interested:
Copies: The number of times you want your design to be cut.
Media Size: The size of the loaded material, which determines how large an area the cutter can move across and cut.
Job Size: The size of the design itself.
Fit to Media: Scales the design to fit within the media size before starting the cut.
Proportional: Locks the design's aspect ratio, so changing one dimension automatically adjusts the other.
Centering: The centering tool aligns your design in the middle of the media.
Rotation: Allows you to turn your design on the media.
Mirror: Flips the design across a chosen plane. Use this when working with heat transfer vinyl.
Weeding vinyl is the process of manually removing excess vinyl from around your design, leaving only the parts that you want to transfer onto your final surface.
As mentioned before, heat transfer vinyl is vinyl that may be ironed onto fabric; the higher the fabric's natural fiber content, the better the design adheres.
Expect to spend at least an hour learning how to complete some basic tasks with Adobe Illustrator. The cutter/plotter is available for use just by walking in during our operating hours.
What is Adobe Illustrator? Adobe Illustrator is a vector drawing program primarily used for graphics.
What is a vector image? While the most common image format, a "bitmap" or "raster" image, is a grid of pixels, a vector image is a collection of instructions: "draw a line 10 units long, then turn 90 degrees, then draw another line…"
Because a vector describes a series of actions rather than just a collection of dots, it can be scaled to infinity (and beyond!) without losing quality. (See the short sketch after these questions for the idea in miniature.)
What software is used to get images prepared for the vinyl cutter? Adobe Illustrator, along with a plug-in for Illustrator called "Cutting Master."
Do I have to purchase Adobe Illustrator? No! We provide use of the Adobe products on the computers in our space. You do need to set up an account with Adobe.
How do I sign back out of Illustrator? In the top taskbar, click "Help" to open a dropdown menu. At the bottom, click "Sign Out…" (it lists the signed-in account), then click "Sign Out" in the pop-up window.
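As promised above, here is a tiny conceptual sketch in Python of why vector scaling is lossless (illustrative only; it is not a real file format or anything the cutter software uses):

```python
# A conceptual "vector image": just a list of drawing instructions.
square = [("move_to", 0, 0), ("line_to", 10, 0),
          ("line_to", 10, 10), ("line_to", 0, 10), ("close",)]

def scale(path, factor):
    # Scaling a vector multiplies every coordinate; no information is lost.
    return [(cmd[0], *(v * factor for v in cmd[1:])) for cmd in path]

print(scale(square, 100))  # a 1000x1000 square, exactly as crisp as the 10x10 one
# A 10x10 *raster* image scaled 100x would have to invent the in-between
# pixels, which is why enlarged bitmaps look blurry or blocky.
```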
Some of the most common problems that arise with projects are listed below, along with basic steps to take in order to diagnose and potentially fix the issue:
"The machine cut lines that aren't in my original image!" Cutter/plotter machines do not "see" images the same way we do; both the Cricut and the Graphtec are designed to follow the vector paths provided in the file and largely ignore things like fill color and stroke width. In general, overlapping shapes are the most common cause of discrepancies between how the image appears on a screen and how it is interpreted by the cutter/plotter. If you would like to preview how the software will interpret a given design, enter Outline View in Adobe Illustrator with the keyboard shortcut "Command + Y". Each line visible in this mode will be cut or drawn when the design is sent to be processed. To return to the standard view, use the same keyboard shortcut.
In general, the fastest way to convert a layered image to a cut-ready format is to select all of the artwork, use "Object > Expand…" from the top menu bar to update the underlying vector paths to match what is visible, and then use the "Merge" setting in the Pathfinder panel ("Window > Pathfinder" in the top menu bar) to flatten the image. This is a destructive edit which CANNOT be undone once saved, so save the result as a new file to avoid overwriting the original design.
"I pressed send, but the Graphtec didn't start cutting my design!" Any time the material is changed on the Graphtec cutter/plotter, the machine must be updated with the type of material loaded (i.e. a roll or a sheet). If the Graphtec is turned on and is connected to the computer but still will not cut, check the LCD screen to make sure that it is not waiting for user input. If it is, choose the appropriate option (roll or sheet). The Graphtec stores all pending jobs in its memory and will begin cutting the sent design as soon as it is done measuring the loaded material. If the design was sent multiple times, it will now cut multiple times.
If this does not fix the issue, turn off the machine and restart the process.
"The machine cut through the backing layers of the vinyl!" If the Cricut is cutting through the backing layer of a two-layer material, it has likely been set up with the wrong material settings. Double check that the material selected in the Design Space software matches the material being used. If this does not solve the issue, change the pressure setting from "Default" to "Less."
The Graphtec machine is manually calibrated to cut through only one layer of the Oracal 651 vinyl. If you encounter issues with this machine, please consult with a manager.
Virtual productions that use LED volumes have skyrocketed in popularity over the past few years, due in large part to two catalysts: The Mandalorian and the pandemic.
The white-hot popularity of these shows, and the tools that create them, is driving adoption and innovation at an unprecedented pace. That’s tremendously exciting for filmmakers, because it opens up more creative possibilities.
But it also means you're more likely than ever to encounter these cutting-edge tools, so it's time to get up to speed; many of us will likely be involved with virtual productions soon.
In today’s article, we’ll walk through 10 essential tips and tricks you’ll need to achieve the best results using an LED volume.
If you've watched The Mandalorian, you've already seen the power of in-camera visual effects using LED volumes and virtual production.
Put simply, the technique combines real-time game-engine rendering in Unreal Engine with LED screens and tracked cameras to deliver visual effects shots live and in-camera.
Instead of shooting in real-world locations or in front of a sea of green screens, actors perform in front of a giant volume of LED panels displaying animated backgrounds that respond to the camera's movement with accurate parallax.
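A quick bit of textbook pinhole-camera math (background intuition, not any engine's specific implementation) shows why the wall must track the camera: a virtual point at depth Z, photographed through a lens of focal length f, shifts on the image plane by

```latex
\Delta x = \frac{f \, t_x}{Z}
```

when the camera translates laterally by t_x. Distant scenery (large Z) barely moves while near objects shift noticeably, so the engine re-renders the wall for every tracked camera position to keep that parallax correct.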
The result is completed visual effects shots captured during production with perfectly matched lighting and camera movements.
Having the entire crew see what the shot looks like instead of making an educated guess on a green screen means actors can respond to what they actually see and cinematographers can frame and light with certainty.
The entire team sees the finished shot on the day of production instead of many months later in post-production, when reshooting may no longer be an option.
Lucasfilm and ILM developed The Mandalorian's virtual production workflow, and the show premiered in November 2019. Though the show was the first major series to tie together camera tracking with game engine animation and LED panels, the concept builds on work created in prior films.
Movies that used LED or projection technology to capture imagery in-camera include Oblivion, Gravity, Rogue One, and First Man.
When The Mandalorian debuted on Disney+ in November 2019, it set the industry on its ear and led many other producers to look to the technology for their productions.
And then the pandemic happened.
Virtual production with LED walls would have become popular on its own without the significant change in world circumstances. But the technical and operational impact of this technology, which dramatically reduces the need for a large crew footprint and eliminates the need for location work and travel, cannot be overstated.
Major productions that would have been shot in real-world locations or on green screens were suddenly reconfigured to be partially or entirely shot on LED volumes instead. These include Star Trek: Discovery, Thor: Love and Thunder, and Bullet Train.
As of this writing, there are over 120 major LED volume studio facilities across the globe, and that number is quickly increasing.
So, it's less a question of if you'll ever find yourself shooting on one of these LED volumes than when.
Now we have a solid background on how LED volumes came about and why they've become so prominent lately.
As incredible as these LED volumes are, as with any other sufficiently technically complex production method, they’re not quite plug-and-play.
Let’s dive into ten things you need to know to make the most out of shooting on one of these stages.
Anyone who’s spent some time on a set or in an edit bay has heard the term “fix it in post.”
During production, when time is short and expensive, perfection is not always the goal on the day, so we tend to kick the can down the road. Factor in green-screen shots, and fixing it in post often becomes the only choice available.
That's where in-camera VFX virtual production differs significantly; it has come to be associated with the opposite term, "fix it in pre."
You can capture a shot live on the day that looks as good as a final composited shot built with post-rendered animation. This has many benefits, especially in reducing the timeframe and costs of an effects-heavy project in post-production.
That means you should treat major creative decisions and final-quality imagery as things to achieve during pre-production instead of deferring them to post-production.
As with any great advance, the idea of fixing it in pre can be a double-edged sword. Many seasoned filmmakers aren’t accustomed to the idea of making every decision in terms of effects imagery before production occurs and may find the process counterintuitive.
Indeed, schedules for films are typically back-loaded with less pre-production time and a more extended post period. To make an LED volume perform its best, this paradigm is inverted, with the lion’s share of visual development occurring in pre-production.
Assets such as models, characters, 3D environments, etc., must be completely camera-ready before production starts.
Along the way, this also means a lot more iteration and visual development can occur. Indeed, the Virtual Art Department, previsualization, and virtual scouting are all vital parts of the LED volume pre-production workflow. The Art Department, Camera Department, Production Design, Visual Effects Department, and other teams are all engaged during this period as the shared creative vision takes shape.
In many ways, the production day becomes about executing a carefully validated plan instead of the best-guess shots in the dark that non-virtual productions can sometimes feel like.
Master the art of fixing it in pre, and you will have a successful virtual production.
The Virtual Production Supervisor and the Virtual Art Department (VAD) may be the newest filmmaking roles you've encountered, so let's look at what each one means.
The Virtual Production Supervisor acts as the liaison between the physical production team, the Art Department, the Visual Effects Department, and the brain bar (or Volume Control Team).
Think of them as a combination of Visual Effects Supervisor and Art Director.
Their responsibilities include overseeing the Virtual Art Department during pre-production and supervising the LED volume itself during production. If you need something magical to appear on the LED volume, the Virtual Production Supervisor should be your first point of contact.
The VAD is where all elements that ultimately wind up on the LED walls are designed and created. This area encompasses a traditional art department, with an emphasis on digital assets.
Thanks to the power of LED volumes and 3D prototyping, environments and props move fluidly between real-world physical objects and onscreen virtual objects.
If you look at an episode of The Mandalorian, you’ll start to notice certain props exist in the foreground as characters touch and stand on them. But they may also end up in backgrounds as part of the LED imagery.
The VAD is constantly creating objects which may be digital models, real-world props, or both.
Understanding what the Virtual Production Supervisor and the VAD do in a virtual production is essential. Finding experienced collaborators for these positions will make or break your virtual production.
So choose your team carefully.
Here’s a technical term for you: “videogamey.”
Basically, this is what we call it when an LED environment doesn’t look believable to the camera. It looks like a videogame instead of a photorealistic background.
You’ll want to avoid this at all costs and make imagery that looks as good as a real set or location.
That’s why photogrammetry is such a valuable technique for filmmakers. It’s what ILM uses to make many of its incredible environments on The Mandalorian. And it’s not that tricky to use.
Photogrammetry is the extraction of three-dimensional measurements from two-dimensional data. The technique has been around for decades, from collaging satellite imagery into maps to everyday use in architecture and construction.
Photogrammetric tools for virtual production such as RealityCapture use machine learning and AI techniques to analyze the geometry and overlapping features in a series of still images taken of a subject. ILM uses its photogrammetry techniques to scan real locations and physical models to create 3D assets for use in the volume.
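The core geometry is textbook stereo triangulation (shown here as background intuition, not the exact algorithm any particular tool uses): two photos taken a baseline B apart see the same feature at slightly different image positions, and that shift (the disparity d) reveals depth:

```latex
Z = \frac{f \, B}{d}
```

where f is the focal length. Repeat this over thousands of automatically matched features across dozens or hundreds of photos and you get a dense 3D point cloud, which the software converts into textured geometry.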
In addition to photogrammetry, you’ll also want to take advantage of Epic Games’ free Quixel Megascan Library. It’s chock full of 3D assets, textures, and environments created with, you guessed it, photogrammetric techniques.
When in doubt about the realism of a particular asset, you can also take advantage of Unreal Engine's built-in tools like light baking and Lightmass global illumination.
What’s great about photogrammetry as a technique is it’s relatively straightforward. Snap many stills of the item or environment you want to create and use that as the basis for your model.
The effort needed to create a photorealistic 3D asset from photogrammetry is often far less than making the equivalent from scratch digitally.
Photogrammetry is a great way to avoid videogaminess on your LED virtual production.
Much of Unreal Engine’s visual mojo comes from leveraging the incredible power of modern GPUs or graphics processing units.
The more powerful a GPU you have to work with, the greater the level of detail in an environment you can have on your LED wall in real-time.
That’s to say, it’s not possible to have too much GPU power—the more the merrier.
The latest version of Unreal Engine offers plenty of virtual production support for multiple GPUs in the same system. Unreal Engine itself is cross-platform and can create games for Mac, PC, and mobile devices.
You’ll need a powerful Windows desktop PC for virtual production, as many of the key components and plugins for virtual production, such as camera tracking and LED panel support, are only available on Windows.
You can get started with an off-the-shelf gaming PC with at least ten CPU cores and a powerful GPU. A high-end NVIDIA GeForce RTX card (or whatever its current descendant is as of this reading) is good for artist workstations.
For the actual nodes connected to the video processors of your LED panels, you need something more robust, such as an NVIDIA RTX A-series graphics card paired with the Quadro Sync II kit.
When it’s time to step up into a semi-professional or professional virtual production system, you might want to consider a purpose-built system from an experienced integrator like Puget Systems, with whom I’ve had some great experiences.
A quality integrator can ensure you have a system that performs well and doesn’t blow its fans nonstop. That’s less of a concern for a gamer, but if you have a computer anywhere near a stage where you’re trying to record sound for picture, you’ll want a quiet system.
The bottom line is to get the most powerful system you can afford.
The more GPU power you throw at your PC, the more real-time quality and performance you’ll achieve and the longer the system will last until you need another upgrade.
And remember, you may need multiple PCs if your volume has multiple surfaces.
Whether you’re building your own LED wall from scratch or working on a rental soundstage, one of the most critical technical attributes is the pixel pitch of the LED panels.
Put simply, pixel pitch is the distance between individual LED lights on the screen and is measured in millimeters.
Because you’re re-photographing the screen, the pixel pitch directly correlates to how the image looks. If it’s not dense enough, the image can look low resolution. Or even worse, you may see moiré patterns.
A moiré pattern occurs when your camera sensor's pixel grid interferes with the grid pattern created by the spaces between the LED lights on the screen.
The higher the pitch, the more likely moiré will appear when the camera focuses close to, or directly on, the screen.
For reference, the pixel pitch of the LED panels used on The Mandalorian is 2.8mm. But that screen is also approximately 20 feet tall by 70 feet across, so the camera can stay much further away and is less likely to focus on the screens.
If you are working in a smaller volume, this can become even more of an issue.
Panels are now available at 1.5mm and even denser, which can mitigate or eliminate moiré. But panels become more expensive as the density increases.
So, there's ultimately a combination of factors to weigh: pixel pitch, camera-to-screen distance, focal length, focus plane, camera sensor size, and content resolution together determine whether your footage shows a moiré pattern.
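As a back-of-envelope illustration (my own sketch with assumed numbers, not a tool any stage actually uses), you can estimate moiré risk by projecting the LED grid onto the camera sensor and comparing it to the photosite pitch:

```python
def projected_led_pitch_mm(led_pitch_mm, focal_mm, screen_dist_mm):
    # Thin-lens magnification for a subject (the LED wall) at screen_dist_mm.
    magnification = focal_mm / (screen_dist_mm - focal_mm)
    return led_pitch_mm * magnification

# Assumed numbers: 2.8 mm panel, 35 mm lens, wall 6 m away, ~6 um photosites.
p = projected_led_pitch_mm(2.8, 35.0, 6000.0)
photosite_mm = 0.006
print(f"LED grid repeats every {p / photosite_mm:.1f} photosites on the sensor")
# Rough heuristic (an assumption, not a published spec): when the grid lands
# every ~1-4 photosites and the wall is in sharp focus, moire is likely;
# more distance, finer pitch, or defocusing the wall pushes the pattern
# below what the sensor can resolve.
```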
That's not to say moiré patterns aren't fixable in post with some 2D paint, compositing, or blur work, but it's best to avoid them entirely if possible.
Keep all this in mind as you choose your LED volume's pixel pitch and your camera and optics, not to mention as you block your shots.
There's a significant scale continuum from the simplest single-wall, rear-projection LED setup to the massive volumes used on The Mandalorian and others.
In general, the larger the volume, the more expensive it will be to rent or to purchase if building from scratch. So, it’s critical to determine how much LED volume you need.
Some full-service LED wall providers such as Line 204, Nant Studios, MELS, and Dark Bay can often customize a setup to match your needs.
Do you mainly need a rear projection screen for car driving/flying shots? Or do you need a completely enveloping environment, fully integrated with a foreground set, for creating a new planet or other massive exterior environment?
The choice you make in volume size and form also has a huge impact on interactive/emitted light.
If, for example, you put actors/set pieces in front of a single, flat LED wall, your subjects will be dark silhouettes against the screen, like someone standing up in a movie theater.
On the other hand, if you have LED sidewalls, ceilings, etc., you will have emissive lighting falling naturally on your subject.
To put these in order of decreasing interactive light, the rule of thumb is: a fully enveloping volume with LED ceiling and sidewalls gives the most, a partial volume with some peripheral panels gives less, and a single flat wall gives the least.
But even if you don’t need or can’t afford an enveloping volume, it’s still very possible to create interactive lighting in sync with the screen content.
Pixel mapping and DMX can turn regular LED full-color lights into the next best thing to peripheral LED panels. In the next section, we’ll dive into this technique.
Digital Multiplex or DMX is a protocol for controlling lights and other stage equipment.
It’s been around for decades with plenty of compatible gear but also has a lot of utility for virtual production. Specifically, you can use DMX lighting to turn multicolor movie lights into effects lights for LED volumes.
You can program specific lighting cues and colors with DMX directly in Unreal Engine or via a lighting board. Or, through pixel mapping, you can set any light on your stage to mimic the color and intensity of a portion of your 3D scene.
This opens up all kinds of possibilities.
For example, if you're doing a driving scene, you could set up pixel mapping to sample a portion of your background plate footage and connect it to lights above and to the sides of your picture vehicle.
You can mimic passing car headlights, street lamps, tail lights, you name it.
And since the DMX is automatically controlling the lights based on the footage, everything is perfectly accurate, synchronized, and repeatable.
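To make the idea concrete, here is a minimal sketch of pixel-mapped DMX using the Art-Net protocol, a standard way to carry DMX universes over Ethernet. The node address, universe, and channel patch are assumptions for illustration; the sketch averages one region of a frame and sends the result to an RGB fixture on channels 1-3:

```python
# Minimal pixel-mapping sketch over Art-Net (DMX via UDP). Illustrative only.
import socket
import numpy as np

NODE_IP = "192.168.1.50"   # assumed address of your Art-Net/DMX node
UNIVERSE = 0               # assumed universe; channels 1-3 patched to one RGB light

def artdmx_packet(channels: bytes, universe: int = 0, sequence: int = 0) -> bytes:
    """Build an ArtDmx packet: Art-Net's message carrying one DMX universe."""
    if len(channels) % 2:                      # DMX payload length must be even
        channels += b"\x00"
    return (b"Art-Net\x00"
            + (0x5000).to_bytes(2, "little")   # OpCode: ArtDmx
            + (14).to_bytes(2, "big")          # protocol version 14
            + bytes([sequence & 0xFF, 0])      # sequence, physical port
            + bytes([universe & 0xFF, (universe >> 8) & 0x7F])  # SubUni, Net
            + len(channels).to_bytes(2, "big") # payload length (hi, lo)
            + channels)

def region_average_rgb(frame: np.ndarray, y0, y1, x0, x1) -> bytes:
    """The 'pixel map': average one screen region down to a single RGB triple."""
    return bytes(frame[y0:y1, x0:x1].mean(axis=(0, 1)).astype(np.uint8))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)    # stand-in for a video frame
rgb = region_average_rgb(frame, 0, 540, 0, 960)      # sample upper-left quadrant
sock.sendto(artdmx_packet(rgb, UNIVERSE), (NODE_IP, 6454))  # 6454 = Art-Net port
```

Run once per frame, the fixture automatically tracks that part of the scene, which is exactly what makes the passing-headlights trick accurate and repeatable.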
To make it all work, you need a DMX-compatible light, preferably with full-color control. Some great examples of full-color DMX lights include Arri SkyPanels, Arri Orbiters, and Litegear LiteMats.
These all make excellent movie lights, whether you’re using them manually to add fill/kick light to your subjects or pixel mapping them to screen content.
Make sure also to get flags, solids, barn doors, and other flagging gear to control the light and keep it from spilling onto the screen itself; spill can wash out the screen and take away from the realism.
Next, you need pixel mapping software. Unreal Engine has DMX control, so you can control DMX lights directly from within scenes. Some other examples of external pixel mapping applications include Enttec ELM, Resolume, and MadMapper.
Many of these apps were developed initially for VJ usage, so they are intuitive and well-supported.
Finally, you need a DMX-to-computer interface. These can be straightforward USB boxes like the Enttec DMX USB Pro Mk2, or they can be part of intricate Ethernet networks fed by an interface box like the Enttec ODE Mk2. You'll also need DMX or Ethernet cabling to reach all your fixtures, or you can use wireless DMX.
In any case, DMX is such a robust and standard protocol you’ll be up and running in no time.
Colorimetry, or the measurement of color, is the science of color itself.
Understanding your camera’s color science and the display technology powering an LED wall is critical to achieving accurate, realistic images.
Color science is essential when working with a digital camera in any situation, doubly so within an LED volume because you’re using one digital device to rephotograph the output of another digital display.
Not mastering colorimetry for your virtual production can result in visual artifacts such as a reduced color gamut and flat, unrealistic imagery.
The light cast from LED screens can also cause unexpected/undesirable results depending on what surface it falls on. Watch out for metamerism, which refers to the visual appearance of an object changing based on the spectrum of light illuminating it. LED panels are designed to be looked at directly, not act as lighting sources.
If you check an LED panel with a color spectrometer such as the Sekonic C-800, you’ll see distinct spikes in areas of the color spectrum vs. the full spectrum you’ll get from a modern LED movie light. The result of spikes in the spectrum can be weird color shifts on costumes and props and waxy/unrealistic flesh tones.
All of this takes away from the believability of your in-camera effects.
One way to mitigate this issue is to supplement the emissive light coming off the LED panels with additional movie lighting. It’s more work to set up but the results are worth the effort.
As virtual production becomes more popular, manufacturers are also responding by developing LED panels with better full-spectrum color science.
You'll also want to find out your LED wall's bit depth, i.e. whether its signal chain is 8-bit, 10-bit, or 12-bit. Having a high-quality video processor from a company such as Brompton or Megapixel VR is key for best results.
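As a quick illustration of why those bit depths matter (simple arithmetic, not tied to any particular processor):

```python
# Levels per channel and the smallest brightness step at each bit depth.
for bits in (8, 10, 12):
    levels = 2 ** bits
    step_pct = 100 / (levels - 1)   # one quantization step, % of full scale
    print(f"{bits}-bit: {levels:>5} levels/channel, {step_pct:.3f}% per step")
# On a slow sky gradient, 8-bit steps (~0.392% each) can read as visible
# banding on camera; 10- and 12-bit steps are far finer and usually invisible.
```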
Find out the color space of your camera LUT and of the render node that’s outputting from Unreal Engine or whatever is driving your onscreen content. Make sure you have high-quality, color-accurate, and large HDR monitors on-set to evaluate the in-camera image.
Understanding how the color spectrum works and where its challenges are within an LED volume is fundamental.
There’s a perception that an LED volume means you’re always getting final imagery in-camera.
And while this can be true for many shots, it’s not true for every last one.
For example, the percentage of final shots captured in-camera on The Mandalorian was around fifty percent in season one, according to ILM. The finality of shots captured in an LED volume can vary from "ready to edit" to "requires some additional enhancements."
Don't think of this as a zero-sum game. Think of it more as a continuum of potential additional finessing in post, rather than all or nothing.
Perhaps the join between the LED wall and the physical set is not perfect and requires a 2D blend. Or the colorimetry is off-balance and requires some power windows in color grading. Or maybe there are additional 3D visual effects that need to be added to a shot. Or you just didn’t have an environment entirely ready for the desired angle.
In that case, you can quickly swap from Unreal to a green-screen halo around just the actors while keeping the rest of the virtual environment up for lighting reference.
Most visual effects supervisors who’ve worked in LED volumes agree that it’s far easier to fix visual issues with in-camera effects shots than to start with green screen cinematography and add the entirety of the background imagery in post-production. It’s a reductive and refining process vs. starting with a blank canvas.
And along the way, the references for lighting, actors, camera framing, production design, editorial, etc., are beneficial.
So don’t obsess over capturing everything in-camera; instead, be satisfied with capturing most of the shots on the day.
Working on film/TV/streaming includes embracing constant technological change.
Some changes can be minor, such as a new helper app or a faster lens. Other changes can have massive ripple effects throughout the production chain, such as the invention of sound for picture, optical compositing, CGI animation, digital cinema cameras, etc.
Virtual production with LED volumes promises to be one of those earth-shattering changes that will completely revolutionize how movies are produced and what kinds of images are possible.
But virtual production itself is in a constant state of rapid change.
What was completely impossible or highly difficult to accomplish one day may be standard operating procedure the next.
One great example of this rapid development is Unreal Engine, no fewer than three major versions of which are available to download and use for virtual production.
New features in Unreal Engine 5, like Nanite virtual geometry and Lumen real-time global illumination, are raising the bar. Each version offers advancements that will make things faster and more realistic in virtual production.
Another recent innovation is GhostFrame, which uses an ingenious method of sync offsets to show different imagery on the same LED screen to multiple cameras. This opens up plenty of new workflows and possibilities for production.
Look out for additional future advancements in AI, camera technologies like LIDAR, and in streaming/remote collaboration. There’s a whole universe to explore and enjoy.
So, to save your time, sanity, and budget, embrace constant change. Attend many webinars, watch a lot of YouTube videos, read all you can, and above all, experiment.
There's a lot to learn about this technique, but there's also an incredible number of fellow learners and resources to help you along.
Virtual production and in-camera visual effects with LED volumes are complex, cutting-edge technologies.
Hopefully, with this essential list, we’ve given you some new things to contemplate as you take your first steps into this larger world. While it’s impossible to know too much about the technology, the cool thing is it’s intuitive and enjoyable.
Many first-timers in LED volumes quickly become converts to its “what you see is what you get” workflow. I sincerely hope you’ll have the same positive experience and do some incredible work you didn’t think was possible. And if you do get stuck, folks like me, and many others in this community, are learning and sharing their experiences.
So please drop me a line, and I hope to see you someday in the volume!