DAY 1 – NAB 2024 Interviews from the floor, Sunday April 14, 2024

Veritone

We talk to Veritone’s Sergey Kuznetsov and Jenn Goode about the company’s long history of integrating AI with Media & Entertainment, their real-time tracking of ad performance, and their new initiative, ‘Ask Veri,’ that provides a chat-based method for querying digital asset management databases. See a video demonstration of Ask Veri here.

Disguise

Addy Ghani from Disguise shares a little of the history of the company and the recent dive into virtual production. We talk about their 2.5D offerings and their complete service from servers to full integrations. Addy also discusses the announcement of a new line of sub-$100K solutions.

VEGAS Pro

Gary Rebholz discusses the history of this venerable editor, and the new features of the latest release cycle, including practical AI implementations.

LumaTouch

Co-founders Terri Morgan and Chris Demeris talk about their unique and popular mobile touch-based editing platform.

The Prismatic Company

Matthew and Abe Feinberg discuss their new platform for iterative content generation, where ideas can iterate from concept to completion in the same design space.

Digital Anarchy

Jim Tierney talks about their tried-and-tested set of plug-ins, along with some of the new AI-based selections for fine-tuning beauty work.

Puget Systems

Matt Bach, Lab Supervisor at Puget Systems, shares insight into which hardware components really matter when designing a great editing workstation. He also explains what sets Puget apart from standard workstations and gaming rigs.

Zero Density

Raul Alba describes the company’s solutions to virtual production, scaling from greenscreen stages to large-scale network broadcast LED volumes.

Review – mLogic’s LTO8 Thunderbolt 3 Archiving Solution

I’m going to start this review by saying that LTO drives scare me. Not in a “scary like clowns” kind of way, but every job I’ve ever done with them has been a complete pain in the butt: different drives, different LTO generations, different software. Now, though, I find myself in a situation where an LTO drive is going to come in very handy, so it’s time to take the plunge. I did some digging around and came across mLogic. They specialize in rack-mounted, internal and combo LTO drives, but what really caught my attention is their external Thunderbolt 3 LTO drives. I have a very large project coming up in 2022, and I have to be smart about how I manage and protect my media, so LTO is going to be the way to go, even though I was hesitant about the drive itself, the archive/restore process and the overall price. So I reached out to mLogic to take one of their LTO drives for a spin, and here’s how it went.

BACK STORY

So, I thought it would be important to preface the review with why I think LTO is the right option for the job that’s coming up. In 2022 I’m going to be doing a ton of media management in a very short period of time, say 45-60 days. We’re talking about 3-5TB of footage acquired almost every day for that stretch, roughly 30 instances of 3-5TB, so anywhere between 90TB and 150TB of media in total. Some people would just say “go with hard drives,” and that’s a possibility, but this media is essential. If a drive or RAID fails, we lose footage we’ll never get back. There are no reshoots; it’s one and done, and I need to make sure that 100% of the media is securely archived while some of it stays available for daily editing. That’s where LTO comes into play: copy multiple hard drives from separate locations to one central storage pool, then run daily archives to LTO. With that in mind, LTO 8 is what I’m looking at. So that’s the backstory; now let’s look at the LTO drive I chose to take for a test drive, and why it’s the one for me.

WHICH LTO DRIVE TYPE IS RIGHT FOR YOU?

I have a PC with Thunderbolt 3 built in, so that’s what I’m going with (if you want the backstory on that, check out my article about building my own Media Composer PC). However, mLogic has plenty of options to choose from when it comes to LTO drives. They have 1U rack-mountable versions and SAS (Serial Attached SCSI) internal and external versions, as well as the Thunderbolt 3 external LTO drives we’ll be looking at in this article. There’s something else to take into consideration: the LTO generation you want to archive to. Currently, mLogic offers LTO 6, 7 and 8 drives, and the higher the LTO generation, the more storage you get per tape. You’ll also see the term “Ultrium LTO” on tape stock as well as drives. Ultrium is the format you’re copying to, so look for it on the stock you buy. Let’s talk about capacities now.

SIZE IS KEY

As I mentioned above, the LTO drive you decide to get directly determines the maximum amount of data you’ll fit on a given tape. Compression is also worth considering, but for the purposes of this article we’re going to throw the compressed sizes out the window, as video files (ProRes and the like) are already compressed and won’t shrink much further. An LTO6 cartridge tops out at 2.5TB uncompressed, jumping to 6TB for LTO7 and 12TB for LTO8. Uncompressed write speeds are roughly 160MB/s for LTO6, 300MB/s for LTO7 and 360MB/s for LTO8. At the end of the day, I always hate the MB/s number, because I don’t really care; just tell me how long it takes to fill a tape. Work it out and filling an entire tape takes about 4h20m for LTO6, 5h33m for LTO7 and 9h16m for LTO8.
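If you want to run the same math for other tape generations or drive speeds, the relationship is simply capacity divided by sustained write speed. Here is a minimal sketch of that calculation, using the uncompressed figures quoted above:

```python
# Rough fill-time estimate for an LTO tape: capacity / sustained write speed.
# Figures are the uncompressed numbers quoted above (capacity in TB, speed in MB/s).
LTO_SPECS = {
    "LTO-6": (2.5, 160),
    "LTO-7": (6.0, 300),
    "LTO-8": (12.0, 360),
}

def fill_time_hours(capacity_tb: float, speed_mb_s: float) -> float:
    """Return hours needed to write a full tape at the given sustained speed."""
    seconds = (capacity_tb * 1_000_000) / speed_mb_s  # 1 TB = 1,000,000 MB
    return seconds / 3600

for gen, (cap, speed) in LTO_SPECS.items():
    hours = fill_time_hours(cap, speed)
    print(f"{gen}: {cap} TB at {speed} MB/s -> {hours:.1f} hours per full tape")
```

Running it gives roughly 4.3, 5.6 and 9.3 hours respectively, which lines up with the times above.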

TAPE COMPATIBILITY

Something else that’s exceptionally important to keep in mind is that the mLogic LTO8 (mLTO8) drives are backwards compatible when it comes to tape stock. I’m not going to go into the generations before LTO8, as that’s what we’re focusing on here, but mLTO8 drives can read and write both LTO8 and LTO7 tapes. Keep in mind that archiving to LTO7 limits you to 6TB per tape, so to be honest I’d only use the LTO7 functionality to read older tapes; I don’t want to be stuck with LTO7’s 6TB (uncompressed) ceiling. Alright, let’s now take a look at what you get with your mLogic LTO drive when you purchase it.

WHAT YOU GET

The unit itself is pretty straightforward. You get the drive, a Fuji LTO8 tape, the power cable and a Thunderbolt 3 cable. Now, here’s where things get really interesting: you also need a program to take media from your system or drives and copy it over to the LTO tape. That software is included in the purchase price of the drive, with a few options if you’re on a PC and one if you’re on a Mac. Setting up the drive was simple: plug it into the wall, then run the TB3 cable from the computer to the drive. Keep in mind that you can use a TB2-to-TB3 adapter if you’re running an older Mac.

[Image: Front of the mLogic LTO8 Drive]

[Image: Back of the mLogic LTO8 Drive]

[Image: What comes in the box]

For Mac users, Canister from Hedge is included with the drive at no extra cost ($299 if you purchased it on your own). And I’ll say this now: I tested Canister with the mLogic LTO8 drive and it was, quite frankly, the absolute easiest experience I’ve ever had archiving and restoring media from an LTO drive. Here’s how it works. The app weighs in at a mere 11.6 MB, and the first thing Canister wants to know is which manufacturer your LTO drive is from. We’re obviously mLogic, so we’ll choose it.

[Image: Downloading the drivers for the mLogic LTO8 Drive]

Something I want to point out before moving forward is that there are three packages you’ll have to download and install: Fuse.dmg, IBM_LTFS.pkg and ICU_IBM.pkg. The best part is that Canister downloads them all for you; you’ll just have to deal with macOS’s annoying security prompts, which don’t want you installing apps from non-Apple developers. Choose to override that and you’re all set. With the drive ready to go, you’ll also want to install the cataloguing app, which keeps a catalogue of each tape so you can quickly track down files later. To install it, simply click the icon in the upper right of the Canister application and, again, Canister does all the work for you; just click “Install” and you’ll be all set to go.

Let’s archive something. When you launch Canister, it opens on the Archive window, asking you to choose a drive or folder to archive.

[Image: The Canister Archive window – basic, but effective]

Once that’s done, simply click “Next” to move on to the “Destination” window. There you can erase an LTO tape by clicking the three-horizontal-line button that appears when you hover over the drive icon. In the Erase LTO window, you can give your tape a name and a serial number for cataloguing purposes.

[Image: Erasing your LTO tape in Canister]

That’s really it.  You’ll now see your drive/folder in the Queue, and you can simply hit Start Transfer, and just wait for it to finish.  Yep.  That’s it.  Super simple.

Alright, now that we’ve covered archiving, let’s talk about restoring. First, insert the tape you want to restore from and launch Canister (assuming it’s not already open). Next, click the “RESTORE” tab at the top of the interface, and you’ll see the drive icon in the middle of the window. Hit the three-line icon in the upper right of the drive icon and select “Mount”. Don’t worry if the drive sits there for a minute thinking about what it’s doing; eventually it will kick in and mount the tape. Keep in mind it’s not mounting on your desktop; it mounts inside the Canister application.

[Image: LTO8 tape mounted in Canister]

Once the tape is there, you have a couple of options. You can see the catalogues of every tape you’ve archived by clicking the icon in the upper right corner (the same one you used to install the catalogue library).

[Image: Library of tapes that have been archived (one in this case)]

Or you can, again, click the three horizontal lines that appear when you hover over the mounted tape and hit “Select Files”. From there you’ll be taken to a Finder window, where you can select the files or folders to unarchive.

[Image: mLogic LTO8 unarchive options]

I have to be honest with you: Canister is the simplest, most effective way to get your feet wet with LTO drives, and it would be worth the $299 price tag even if it weren’t included free for Mac users. Windows users, on the other hand, have some choices, so let’s talk about what to do if you’re on Windows.

I’ll start this section by saying that I knew nothing about any of the companies whose software you can choose when you purchase your LTO drive, and that choice has a direct impact on price, since each company charges differently. I decided to give the Archiware P5 Desktop Edition a spin to see how easy it would be to do an archive. Here’s how it went.

The first thing I want to point out is that P5 is available for both Windows and Mac, so if you already own an LTO drive and are looking for software to run it, this is an option as well. Alright: I have my license, I’ve downloaded and installed the software, and when I hit the P5 Client shortcut I get the screen below.

[Image: Archiware P5 web interface]

So, the first thing to do is punch in the username and password (which can be changed later if you need to), and once that’s done I’m brought to the main home window, where I can see my license and everything it unlocks.

If it’s your first time using the app, you can simply enter your license manually and you’ll be all set. So where do we start? Since I want to do an archive, I hit the “Archive” button at the top of the interface and land on the “Getting Started” option. If you don’t see that window, click “Getting Started” in the sidebar and you’ll see an Archive Wizard ready to get you rolling.

[Image: Archiware P5 Archive Setup Wizard]

So, let’s follow it through.  The first thing that happens when I hit “Start Setup Assistant” is that I’m brought to the “Select Target Storage” option, where I can choose my mLogic LTO8 drive.  Keep in mind that I am running the drive with Thunderbolt 3 natively on my PC using a Gigabyte Z490 motherboard with onboard TB3 ports.

[Image: Archiware P5 storage selection]

[Image: Selecting the mLogic drive – well, that was easy!]

Once I’ve selected my drive, it’s time to choose what I want to back up. No problem: simply navigate through your drives and choose the folder(s) you want to archive.

[Image: Choosing which data to archive]

Once that’s done, you’re ready to pick a time (or start it archiving right away).

[Image: Creating an archive plan]

That’s really about it. Once you’ve started backing up folders, you’ll be building an archive you can access via the P5 web app, and you can easily schedule archives to run once or every day at specific times; unarchiving is just as simple. I should point out that I’m only scratching the surface of what Archiware’s P5 can do. I was initially hesitant about archiving to LTO from Windows, as every application I’d used before was convoluted and confusing, but any hesitancy was put to rest by P5: its simple archiving wizard had me archiving within minutes of opening the web interface, and you can start out simple and expand your archiving needs with this very in-depth application.

DOWNSIDES

If I had to pick a downside to the drive, there’s really only one: it’s loud. Really loud, even when it’s not archiving. Loud enough that I wouldn’t run it while trying to edit, and it has to sit pretty close to my computer because the TB3 cable is only so long. If your machine lives in another room (a central equipment room, or CER), this won’t be an issue, but you don’t want to be on a Zoom meeting while it’s running, as the noise is… well… noticeable.

IN THE END

I have been in the market for an LTO archiving solution for a while now, and mLogic’s LTO8 drive is, in a word, awesome. Pretty much everything about it is dirt simple: simple to set up, simple to archive to and simple to restore from. Don’t let the base price of $5,000 US scare you away (that’s the Mac price; Windows users will pay more to include an archiving app with the drive), because what you’re getting is 30-year shelf safety on every 12TB (uncompressed) tape you put away. If you’re a Mac user, getting Canister with your drive is an absolute steal, as it’s easily worth every cent of its $300 standalone price, and Windows users can rest easy too: Archiware’s P5, although an additional cost, is simple to get up and running, and you’ll have projects archiving minutes after installing it. You can find out more about mLogic’s LTO8 drive at the included link.

The DigitalGlue //ROGUE: The NAS you didn’t know you could afford

DigitalGlue’s //ROGUE unit is a robust, affordable 48TB NAS drive for video editors and digital creatives. But what makes it truly unique in its price range is the managed storage software and remote monitoring service that comes with it.

I’m always nervous when asked to review a product by a site sponsor. I don’t want to be caught in the position of endorsing a product I wouldn’t actually want to own myself. Fortunately, the DigitalGlue //ROGUE turns out to be an exceptional product with a surprisingly reasonable price tag (starting at $4,795 for the 48TB version). It packs a massive amount of user control into an intuitive GUI, but it’s ultimately the support service that makes it a true turnkey solution for artists who don’t want to pay for (or be) a dedicated IT support staff.

For the most part, I’m a DIYer. (My wife would probably just call it being stubborn and cheap.) So I’ll usually roll my own solutions before using any kind of systems integrator. However, one arena where I’ve learned to leave it to the specialists is storage solutions. I was taught that painful lesson when my studio almost lost 3 screen minutes of VFX on a major Hollywood release to drive failure. Watching experts come in and rebuild header information convinced me that being thrifty with IT isn’t the best plan when hundreds of thousands of dollars of work are stored on the drive platters.

Let’s dive into the //ROGUE’s specs and features first, then talk about what the service contract adds to the equation.

The Hardware

First of all, this is not just a fast RAID drive you connect to the back of your Mac Pro to use for your video editing. This is a fully managed storage solution with multiple user accounts and the ability to assign drive space to different users in the same facility. It’s also equipped with the modern ZFS file system—an actively self-healing volume manager. But let’s set that to the side for a moment and look at the hardware.

To connect to your network the //ROGUE sports 2 x 10 GbE ports and 2 x 1 GbE ports. Obviously to take full advantage of the RAID speed you’ll want to connect via one of the 10 GbE ports. You can do this either by directly linking it to a single workstation (if you’re a one-man/woman show), connecting two workstations via the two dedicated 10GbE ports, or by connecting the unit to a 10GbE switch which then distributes it to two or more workstations at your facility. By the way, the new Mac Pros ship with an included 10GbE port, as do many modern PC motherboards.

Just how many edit workstations you’ll be able to feed from the //ROGUE is going to depend on your codec choice and the power of your switch. 4K codecs range from highly-compressed offerings like Panasonic GH4 at 100 Mbps, all the way up to RAW 4K codecs like Phantom Flex at 3.5Gbps.
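To ballpark how many simultaneous streams a given connection can feed, the back-of-the-envelope math is just link speed divided by codec bitrate, with headroom left for protocol overhead. Here is a rough sketch; the codec bitrates are the examples mentioned above, and the 70% usable-bandwidth allowance is an assumption, not a measured figure:

```python
# Back-of-the-envelope: how many streams of a given codec fit through a link?
# The 0.7 "usable fraction" is an assumed allowance for protocol overhead and
# contention, not a measurement of the //ROGUE or any particular switch.
def max_streams(link_gbps: float, stream_mbps: float, usable_fraction: float = 0.7) -> int:
    usable_mbps = link_gbps * 1000 * usable_fraction
    return int(usable_mbps // stream_mbps)

print(max_streams(10, 100))    # ~100 Mbps 4K (GH4-style): dozens of streams
print(max_streams(10, 3500))   # ~3.5 Gbps RAW 4K: only a couple of streams
```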

In my testing I had no problem editing native UHD 4K DNxHR HQX in the Resolve timeline. In fact I could comfortably get 6 individual streams (scaled on the fly to picture-in-picture) to play back simultaneously. Part of the reason for this is the massive amount of RAM in the //ROGUE (more on that later). With that many streams you may get a “bump” as the media starts pulling from the RAID, but it efficiently transfers the streams to a RAM buffer so that ultimately it’s streaming from RAM rather than directly from the spinning disks. I found that with the footage in the RAM buffer I could rapidly scrub several seconds of those 6 streams back and forth without a glitch.

I mentioned a moment ago that your switch could also affect performance. All network switches are not created equal. A managed switch with a powerful processor and good prioritization protocols will be much better at getting the right data where it needs to go than a sub-$100 switch. Your switch choice is something you’d want to discuss with the DigitalGlue team before taking delivery of the //ROGUE.

Drive Configuration

The unit I tested was the 48TB version, with 4 x 12TB spinning disk drives configured as RAID 6, giving me roughly 24TB of usable storage. For those unfamiliar with RAID numbers, RAID 6 offers a high degree of protection against drive failure at the expense of total storage space. Three of the four drives would have to fail for you to lose any data. If you want more space and slightly better write speeds, you can configure the unit for RAID 5, which still allows one drive to fail without losing any of your precious data. (It also supports RAID 1, RAID 10, and RAID 0 for the reckless, Ferrari-driving mavericks of the world.) For video editing I’d personally choose RAID 5, but for a general-purpose studio HUB for use with editorial, motion graphics, animation, and VFX work, RAID 6 is the safe and sensible choice.
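As a quick sanity check on those numbers, usable capacity for parity RAID is simply the drive count minus the number of parity drives, multiplied by the drive size. A small sketch of that standard RAID math (nothing //ROGUE-specific here):

```python
# Usable capacity for common parity RAID levels: (drives - parity) * drive size.
PARITY_DRIVES = {"RAID 5": 1, "RAID 6": 2}

def usable_tb(level: str, num_drives: int, drive_tb: float) -> float:
    return (num_drives - PARITY_DRIVES[level]) * drive_tb

print(usable_tb("RAID 6", 4, 12))  # 24.0 TB usable, as in the review unit
print(usable_tb("RAID 5", 4, 12))  # 36.0 TB usable, one-drive fault tolerance
```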


But that’s just the RAID…

In addition to the RAID, my test unit included two optional solid state drives: a 2TB M.2 NVME and a 1 TB M.2 SATA (as well as a 500GB SSD for the system OS software). Why? Well, the 2TB NVME offers crazy fast performance for caching exercises. Think Resolve or Premiere render cache, or After Effects or Nuke disk cache. I wasn’t expecting that kind of addition in a NAS box. In my testing I found caching to be almost invisible, thanks to the NVME transfer speeds.

Disclaimer: my test system includes a 32-core CPU and 2 x RTX 2080 Ti GPUs, so I wasn’t exactly waiting around for caching to finish… If you are on a machine with many cores, try adding the line Local.DtManager.1.Threads = 8 to the Advanced tab in Resolve’s preferences, where “8” is the number of threads you want it to use; 8 is the default, so raising it on a high-core-count machine can improve performance. Also, make sure Direct I/O is enabled for the Rogue mount in the Media Storage preferences (it should be), so that the system file cache doesn’t get in the way.


The other M.2 drive served (in my case) as a fast location for the Resolve database. The guys at DigitalGlue tell me that while there is already support for Premiere Pro “Productions,” even tighter integration is planned for the other major editing platforms. If you want to use the SSD for your normal production data, the Copy and Sync tool in the web app makes it fast and easy to manage data within the system between the hard disk and solid state drives, with conflict resolution options that include pushing overwritten files to an archive folder. Pushing the files to a timestamped subfolder makes it easy to relink them, since the file names aren’t changed. The Copy and Sync tool is especially helpful for copying lots of files, like image sequences, since it doesn’t have to “prepare” first like it does during standard Windows and Mac OS copy operations.

All in all, the solid state devices take the typical I/O requirements away from the local workstation and push them to the NAS. In other words, your workstation can have pretty humble local drive specs and still get smoking performance by leveraging the fast random access of the //ROGUE’s NVME drive.

This has interesting implications for general studio work. If you do digital work across the board—editing, animation, compositing, etc.—then you can potentially save the money spent beefing up individual machines and benefit from a centralized data hub, rather than worrying about housekeeping half a dozen local hard drives.

Note: These SSDs are an add-on for the //ROGUE. The NVME M.2 drive is available in both 2TB ($595) and 4TB ($1695) capacities. The SATA M.2 for the DaVinci Resolve database comes in 500GB ($249) and 1TB ($399). Both M.2 drives go on the same PCIe card (which includes fans and heatsinks), so it’s cheaper to purchase them together: the options are the 2TB NVME with 500GB SATA ($795) or the 4TB NVME with 1TB SATA ($2,049). I think this is an essential part of the system, and the $795 price point is well within the current market range for integrated solid state storage.

The Brain

The CPU running the show is no slouch: The system I tested sported a genuine 4 core Xeon D-1518, more than enough horsepower to efficiently push data around while performing all the system diagnostics, automatic defragging, and drive sector self-healing (we’ll get to those in a bit). It also came with 64GB of ECC (error-correcting server-grade) RAM. I was a little surprised by how much RAM was supplied on a box like this. The DigitalGlue techs explained to me that due to the way the advanced ZFS file system operates, 1GB of RAM is required for each terabyte of drive space. So 64GB is enough to cover the 48TB of drive space, while still leaving plenty of headroom for other system tasks.

The generous specs show: I never experienced any of the freezes or sluggish UI moments I’m accustomed to finding in many commodity headless Linux systems. The UI was snappy even when copying massive data sets. And as I mentioned earlier, that RAM really helps to smooth out access buffering when pulling multiple streams from the spinning disk RAID.


The Software

With a lot of NAS solutions the review would end here. But the //ROGUE includes what would typically be called “enterprise class” software. In fact, the //ROGUE is running the exact same software system that DigitalGlue creates for their larger enterprise clients. Those clients include most of the major television networks, as well as high profile creative studios like Local Hero.

The vast amount of configurability and system reporting may be overkill for most small studios, but the UI is designed in such a way that they don’t overwhelm. Stats like CPU load, TCP Packet and UDP socket activity, and IPv4 errors are there if you need them, but can be ignored if you don’t. More importantly, they’re available to DigitalGlue’s remote monitoring team, who can alert you when they do find something awry.

The system interface is built around a modern, responsive AJAX-style web service, accessible from any browser with network access to the //ROGUE (either local or via VPN). Let’s take a look at some of the standout features.

User Management and Shared Spaces

One of the key features of //ROGUE’s user management is the idea of “Spaces.” Spaces are virtual partitions you can use to carve up the available storage between projects and/or users. Unlike actual drive partitions, spaces are completely dynamic. Each space can be set with a maximum size limit, and a maximum space reservation.

This means that you can limit a space (and the users that have access to it) to not grow beyond a certain size and you can reserve storage in anticipation of a growing project, so that none of the competing spaces fill up the drive before that project starts to make use of it. One rule of post we’ve all learned: regardless of the number of petabytes, if the drive space is there, it will fill up. This one feature alone can save all kinds of headaches and phone calls trying to figure out what data is safe to delete during an emergency delivery schedule.
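To make the limit/reservation distinction concrete, here is a minimal sketch of the kind of bookkeeping involved. This is purely illustrative logic under my own assumptions, not DigitalGlue’s actual implementation: a space can never grow past its limit, and the unused portion of every other space’s reservation is held back from it.

```python
# Illustrative sketch of quota + reservation accounting for shared storage
# "spaces". Not DigitalGlue's code -- just the behavior the article describes.
class Pool:
    def __init__(self, total_tb: float):
        self.total = total_tb
        self.spaces = {}  # name -> {"used", "limit", "reserved"}

    def add_space(self, name, limit_tb, reserved_tb=0.0):
        self.spaces[name] = {"used": 0.0, "limit": limit_tb, "reserved": reserved_tb}

    def free_for(self, name):
        # Capacity not used anywhere and not reserved (but unused) by other spaces.
        used = sum(s["used"] for s in self.spaces.values())
        reserved_elsewhere = sum(
            max(s["reserved"] - s["used"], 0.0)
            for n, s in self.spaces.items() if n != name
        )
        return self.total - used - reserved_elsewhere

    def write(self, name, tb):
        space = self.spaces[name]
        if space["used"] + tb > space["limit"]:
            raise RuntimeError(f"{name} would exceed its {space['limit']} TB limit")
        if tb > self.free_for(name):
            raise RuntimeError("pool full (other spaces' reservations are protected)")
        space["used"] += tb
```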

User management is implemented using a full LDAP server. If you don’t know what that is, don’t worry; basically it’s the same underpinnings used by systems like OS X server to manage secure user permissions. Users can be assigned to groups, and permissions can be set at pretty much every level of the drive, from spaces to individual directories if needed. No more junior PA’s accidentally deleting the entire work-in-progress directory…

Once a user account is created, it can be easily be deactivated and reactivated. That’s great for recurring freelancers; when they return for the next gig you can simply turn back on their account and they pick up where they left off. By assigning permissions to groups and then moving users in and out of those groups, you can change out your team on the fly without having to manually change any permissions.

And I need to emphasize this: account management is easy, thanks to the clean, intuitive “Team” section of the management software. If you know how to set up a User Account for yourself in OS X System Preferences, setting up users in the Team panel will be a breeze.


Self-healing and the ZFS File System

I have to confess my knowledge of file systems has become a little rusty of late, so I was interested to hear about the feature set of the ZFS file system used by the //ROGUE. Even though each space can present to client machines as either an SMB or NFS mount, ZFS is the file system that manages all the data at the drive level. While ZFS began development back in 2001, it’s only been since 2013 that the open-source platform OpenZFS has been free to proliferate outside of the corporate confines of Oracle and Sun Microsystems.

For those curious about ZFS, I’d encourage you to check out the excellent Wikipedia article on the subject. For everyone else who just wants to store cool stuff on it, the important takeaway is that ZFS is super-efficient and designed to protect you from data loss. It does lots of hi-tech voodoo with checksums to identify write errors and fix them (“self-healing”), protects against failed copy and move processes, and can even work with the hardware RAID to make sure data can be rebuilt even when the RAID controller hardware itself gets damaged. (Anyone else been faced with a fried RAID board that stopped being manufactured years prior?)

ZFS will also “learn” how best to serve up your data. Its adaptive cache decides what to purge by balancing two measures: Least Recently Used and Least Frequently Used. When the system has to go back to the drives to fetch a file that was previously in the cache, it adjusts that balance to keep the right data accessible. This means it can adapt to your “consumption” style: it can serve data one way when you’re working on a long-form edit and another when you’re jumping between a dozen small clips in an After Effects project.
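For the curious, here is a toy sketch of the idea behind that kind of adaptive caching: score each cached item by both how recently and how frequently it was touched, and evict the lowest-scoring item when space runs out. This is a simplification for illustration only; ZFS’s real caching algorithm is considerably more sophisticated.

```python
import time

# Toy cache that blends recency and frequency when deciding what to evict.
# Illustrative only -- ZFS's adaptive cache is far more sophisticated.
class BlendedCache:
    def __init__(self, capacity: int, recency_weight: float = 0.5):
        self.capacity = capacity
        self.w = recency_weight
        self.items = {}  # key -> [last_access_time, access_count, value]

    def get(self, key):
        entry = self.items[key]
        entry[0] = time.monotonic()  # refresh recency
        entry[1] += 1                # bump frequency
        return entry[2]

    def put(self, key, value):
        if key not in self.items and len(self.items) >= self.capacity:
            # Rank entries by recency and by frequency, then evict the entry
            # with the worst combined rank (least recent AND least frequent).
            by_recency = sorted(self.items, key=lambda k: self.items[k][0])
            by_count = sorted(self.items, key=lambda k: self.items[k][1])
            def score(k):
                return self.w * by_recency.index(k) + (1 - self.w) * by_count.index(k)
            del self.items[min(self.items, key=score)]
        self.items[key] = [time.monotonic(), 1, value]
```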

The DigitalGlue team will be able to fine tune some of that performance for you once the system has gathered data from the first month or so of usage.

Snapshots

Another huge feature of ZFS is snapshots, something that DigitalGlue has implemented very elegantly. You can schedule snapshots of your current data down to each hour of the working day, and the ZFS system uses metadata to track changes in files between snapshots. They’ve even added custom logic so that you can set a date and time for each snapshot to expire. For the most part, snapshots require very little additional drive space, while allowing you to roll back the state of your drive in the case of accidental file deletion or overwrites. Users of OS X’s Time Machine function will be familiar with the concept.
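As a concrete illustration of the expiring-snapshot idea, a pruning pass only has to compare each snapshot’s expiry stamp with the current time. A minimal sketch follows; it is illustrative only, since the creative.space software handles snapshot scheduling and expiry for you, and the snapshot names and dates are made up:

```python
from datetime import datetime

# Illustrative pruning pass for snapshots that carry an expiry date.
# The //ROGUE's software does this for you; the entries below are invented.
snapshots = [
    {"name": "projects@2020-04-06_1700", "expires": datetime(2020, 4, 13, 17, 0)},
    {"name": "projects@2020-04-07_0900", "expires": datetime(2020, 5, 7, 9, 0)},
]

def prune(snapshots, now=None):
    """Split snapshots into those to keep and those whose expiry has passed."""
    now = now or datetime.now()
    keep, expired = [], []
    for snap in snapshots:
        (keep if snap["expires"] > now else expired).append(snap)
    return keep, expired

keep, expired = prune(snapshots)
print("expired:", [s["name"] for s in expired])
```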

File Browser

This might not immediately sound exciting, but if you’ve ever gone searching through months (or years) of media debris for that image you vaguely remember the name of, you’ll appreciate the //ROGUE’s file browser, known as the Spaces Browser. It indexes all files on your NAS for fast searches, displaying previews in-browser for images, PDFs, text documents, and web-friendly video files.

The Spaces browser also allows permissions editing at the individual file and directory level, allowing you to carefully lock down (or open up access to) files at an atomic level, if the need arises. You can give a user access to only the file or files they need to see; no risk of them seeing another client’s content as they access those files.

The //ROGUE comes with a built-in Domain Name Server (DNS) that lets you set a Broadcast Name. For example, the system I tested with was “rogue04.local”. Whatever name you set will have the same “.local” extension. Normally, users connect using an IP address, which means that each user will likely have a different URL. Using the Broadcast Name to connect to the web app means that everyone will have the same URL even when exposing your system to the internet or providing access through a VPN. This allows you to copy links in the Spaces browser and share them with others who can access your //ROGUE. The browser includes a download button for files, so you, your team, and your clients can easily pull down files over the internet without having to mount the Spaces to your desktop.

Desktop App

The web app makes it easy to copy links to Spaces and folders to mount them manually, but if you’re on a Mac the creative.space desktop app takes it one step further. All you have to do is put in the IP address for the storage once and then it will remember it. It even lets you put in a custom name so you can easily keep track of all the systems you’re connected to. Once you log into the desktop app as a user, mounting spaces and folders is as simple as clicking the “mount” button.

You can bookmark the spaces and folders you use most often to make them easier to reach. There are a few more cool features like viewing in Finder and applying folder structure templates. The nice thing is that the app will keep track of what you’ve mounted, even when you log out. So all of your Spaces will automatically mount the next time you open the app. If you need to get to the web app, there is a button in the desktop app that will launch it in your default browser.

If you’ve ever had to walk someone through how to connect to storage, you’ll appreciate how the desktop app makes this a one-click process. What’s even nicer is that the desktop app lets you connect to multiple creative.space nodes as long as you can connect with the right IP and user. This means that you can connect to nodes via WAN or VPN and mount spaces the same way you would locally. The performance will only be as fast as your connection to the storage of course, but this might be fast enough for proxy workflows.

System Diagnostics

The actual system monitor panel is a little mind-blowing. First of all, I don’t know that I’ve seen many diagnostic UIs as clean and pretty. This is true of all the interface panels in the creative.space software that ships with the //ROGUE. I mentioned earlier in the article that you get monitoring right down to network packet activity. Hover over a graph and you can see individual readings for specific slices in time.

Do I want to sit around watching these graphs all day? Absolutely not. But if for some reason playback starts chugging during a client edit session, I like the idea that I can pop open a web browser and see that the intern in the next room is unnecessarily ingesting 10 TB of next month’s show onto the NAS.

One section in particular that comes in handy is the user section. Here you can see how user reads and writes are saturating the RAM and disk bandwidth. Individual graphs for each user help you identify bottlenecks and problem usage.


And then some…

I’ve hit on the main features of the system, but there are a bunch of other goodies in here like load-balanced media ingest via the USB ports, project folder templates, copy-and-paste file paths for both Mac and Windows, a built-in FTP server for remote file access, and so on. Again, while the //ROGUE is squarely targeted at the boutique studio, it inherits the full feature set of the software developed for DigitalGlue’s major network clients.

Getting Up and Running

Now whether it was a printer or a new esoteric PCI card, most of us have spent a day (or three) wrestling to get some new piece of tech talking to our other equipment. Not so with the //ROGUE, because DigitalGlue provides the equivalent of a “white glove” delivery service. Before shipping they contacted me to preconfigure the static IP address of the unit to match my local network subnet. (Don’t worry, two of the ethernet ports come preconfigured for DHCP, so you won’t get locked out of the box if you switch networks—unlike other hardware I’ve owned over the years.) You’d also want to discuss the best RAID level for your needs.

Once the unit arrived, the DigitalGlue support team arranged a call to walk me through setup and configuration. The drive already came formatted, configured, and ready for work. There were only a couple of extra steps: 1.) The team walked through the process of configuring the unit for direct 10GbE connection to my workstation, and 2.) As a Resolve user, they helped me configure the Resolve database and cache to take advantage of the solid state drives included with the //ROGUE.

They also gave me a quick walkthrough of the system software and the basic process of creating spaces, user accounts, snapshots, and permission groups.

The Real Magic: Remote Monitoring and Support

Between the hardware and software alone, for the $4,795 offering price the //ROGUE is a great deal. But what makes it a killer deal is the inclusion of an entire year of proactive monitoring by the DigitalGlue team.

In some ways, the //ROGUE is a great marketing strategy to expose new customers to DigitalGlue’s particular brand of software as a service. It offers users that might balk at a full-blown service contract the opportunity to experience the benefits of managed storage through a device they own and keep.

The folks at DigitalGlue raised an interesting point: the cost of even a part-time IT employee at a facility quickly dwarfs the monthly service fee for their larger creative.space storage offerings. Someone who buys a //ROGUE today and experiences the value of remote monitoring is going to be much more open, as they grow, to the idea of paying monthly for storage that comes with DigitalGlue’s support.

One anecdote the team shared with me was of a facility working on a major network show. The DigitalGlue support staff were alerted after normal business hours to unusually high CPU temperatures on the NAS they’d installed there. They called the studio and one of the PA’s went back to discover the facility’s air conditioning in the server room had gone out. They were minutes away from a meltdown not just of the NAS, but of tens of thousands of dollars worth of hardware, and probably hundreds of thousands of dollars worth of content. That’s what you call a value add.

Conclusion

If you’re in the market for a robust, secure storage solution for editing or other creative work, the //ROGUE is definitely worth looking at. Having spent thousands more in the past on unmanaged RAID systems with no support other than a Korean phone number connected to an answering service, I can tell you that you can’t overvalue the support contract that comes included with the purchase. Even putting the active monitoring to one side, the combination of reliable hardware and powerful, configurable software make the //ROGUE an excellent buy at the price.

The //ROGUE system is available for direct purchase from creative.space. The 48TB version is currently on sale for $4,795, and the 64TB version is available for $6,395.

As a special promotion, creative.space is offering interest free payments over 24 months. That makes the 48TB unit $199 a month, and the 64TB unit $265 a month.

As mentioned earlier, if you want to add on the SSDs, the NVME M.2 drive is available in both 2TB ($595) and 4TB ($1695) versions and the SATA M.2 options for the DaVinci Resolve database are 500GB ($249) and 1TB ($399). Since both of the M.2 drives go on the same card, it is cheaper to get them together. The options are to get the 2TB NVME and 500GB SATA ($795) or the 4TB NVME and 1TB SATA ($2,049).

Visit the //ROGUE product page here.

The Pitfalls of Online File Sharing and Sending Services for the Media & Entertainment Industry

Easy to use, readily accessible and consumer-oriented, online file sharing platforms such as Dropbox and Google Drive are, to end users, a pleasant replacement for older file transfer methods such as FTP. FTP is complex to use and requires IT intervention for almost any change. The pain associated with FTP, which was developed in the 1970s, is one of the factors that opened the door for the rise of online file sharing services in the workplace.

Along with file sharing, a series of consumer-focused tools such as WeTransfer has emerged for sending large files, navigating around the size limitations of most email systems. Given the consumer adoption of such tools, it’s no wonder so many business users are willing to circumvent internal IT systems and use file sending services when they need to send a large file to co-workers, partners or customers.

These trends, of course, make IT managers nervous (especially every time a major online file sharing data breach gets publicized), because as users self-migrate from FTP or internal email systems to unsanctioned public platforms, leaving IT out of the loop, they open their companies to risks and challenges those companies can’t afford to take.

In some cases, EFSS (Enterprise File Sync and Share) systems including Box and Dropbox for Business are options for IT organizations to consider. These typically take a consumer-like offering and add additional management controls and, in some cases, more sophisticated security. EFSS solutions are widely deployed, and they are often considered a good choice for business documents. However, when it comes to sending and sharing mission-critical content or massive video files, such EFSS systems aren’t the best solution either.

“We were using a range of other products like WeTransfer, Dropbox and Box. We had so many different ones that were either client specified or just set up for specific reasons that it was very difficult to manage or help people if they had an issue.” – Michael Ball, Post-Production Supervisor, Accord Productions

Online file sharing and sending (OFSS) services are great for consumers and even for some simple business use cases – there’s no denying that. But when it comes to mission-critical content and the complex workflows of modern M&E companies, these services just aren’t up to the challenge.

 

Seven pitfalls of OFSS services for M&E companies

1. No Acceleration

For all the promises of ease, access and user-friendliness of OFSS services, they fail to address the biggest bottleneck of all in moving large video files to other users: the transfer speed.

While employees might think they’re circumventing the annoying FTP process, they’re still left waiting for transfers to complete. Why? Because OFSS services still rely on traditional TCP transfers that don’t maximize network bandwidth; they are just as slow as FTP, but with a friendlier face.
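The underlying issue is the bandwidth-delay product: a single standard TCP connection can only move one window’s worth of data per round trip, so its throughput is capped at roughly window size divided by round-trip time, no matter how fat the pipe is. A quick sketch of that ceiling; the window and RTT values are illustrative, not measurements of any particular service:

```python
# Rough ceiling for a single TCP stream: throughput <= window_size / RTT.
# Window and RTT values below are illustrative, not measurements.
def tcp_ceiling_mbps(window_kb: float, rtt_ms: float) -> float:
    bits = window_kb * 1024 * 8
    return bits / (rtt_ms / 1000) / 1_000_000

print(tcp_ceiling_mbps(64, 100))  # ~5 Mbps over a 100 ms intercontinental link
print(tcp_ceiling_mbps(64, 10))   # ~52 Mbps over a 10 ms regional link
```

That is why a long-distance transfer can crawl even on a gigabit connection, and why acceleration technologies exist in the first place.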

2. Flawed Security Models

Even if a company advertises that files are encrypted in storage, you have no assurance that your OFSS provider is following secure design principles. And even if you get that assurance, is your company’s intellectual property and PII (Personally Identifiable Information) data now sitting in someone else’s storage?

Just by using an OFSS service, you could be violating your company’s security policies with respect to PII. Freemium versions of the majority of these products seldom support critical security controls.

3. Storage Lock-in

You can’t choose where you want your assets stored. OFSS services use their own storage – and that could be anywhere. This means your IT department has no control over the actual server where assets are stored.

4. Closed Storage System

Additionally, because you have no control of where the content is stored, you cannot access this storage via other mechanisms. It may be impossible for you to directly interact with your stored files or move your files through an automated workflow that is outside or adjacent to your OFSS storage.

5. File Size Limits

As of July 2017, Dropbox has a file size limit of 20GB per file. That may seem like a lot, but with today’s 4K cameras you can exceed that limit quickly, depending on FPS (frames per second), bitrate and codec.
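To put that limit into perspective, the recording time it takes to reach a per-file cap is simply the cap divided by the bitrate. A quick sketch; the bitrates below are round illustrative numbers rather than the specs of any one camera:

```python
# How many minutes of recording fit under a per-file size cap at a given bitrate?
# Bitrates are round illustrative figures, not specs for a specific camera.
def minutes_until_cap(cap_gb: float, bitrate_mbps: float) -> float:
    cap_megabits = cap_gb * 1000 * 8
    return cap_megabits / bitrate_mbps / 60

print(minutes_until_cap(20, 100))   # ~27 min of 100 Mbps 4K
print(minutes_until_cap(20, 400))   # ~7 min of 400 Mbps all-intra 4K
```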

6. Poor Control and Visibility

Without direct ownership of or access to the storage and server management, using an OFSS service introduces a dangerous barrier to corporate visibility. Being able to restrict access and assign granular permissions to files ensures that only the right people have access to the content. And, being able to view, track and audit activities means that if a breach occurs you can pinpoint its source.

7. No Robust Transfer Mechanisms

Even if you decide that OFSS meets all your other needs, you can’t afford to waste time starting transfers over from the beginning. With large files, a checkpoint restart function becomes very important. If a file transfer is interrupted by internet connectivity or other network problems, having to restart it manually, or worse, start it over from the beginning, can be a nightmare for deadlines, especially when there’s no file acceleration.
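For context on what checkpoint restart means in practice: a resumable transfer records how many bytes have already landed and asks the server to continue from that offset rather than from byte zero. A minimal, generic sketch using HTTP range requests follows; it illustrates the concept only and is not how Signiant or any particular OFSS service implements its transfers:

```python
import os
import requests

# Generic resumable-download sketch using HTTP Range requests.
# Illustrates the "checkpoint restart" idea only; not any vendor's implementation.
def resume_download(url: str, dest: str, chunk_size: int = 1 << 20) -> None:
    done = os.path.getsize(dest) if os.path.exists(dest) else 0
    headers = {"Range": f"bytes={done}-"} if done else {}
    with requests.get(url, headers=headers, stream=True, timeout=30) as resp:
        resp.raise_for_status()
        # 206 means the server honored the range; otherwise start from scratch.
        mode = "ab" if resp.status_code == 206 else "wb"
        with open(dest, mode) as f:
            for chunk in resp.iter_content(chunk_size):
                f.write(chunk)  # everything written so far is the checkpoint
```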


A solution

The answer is to make the move to a next-generation file transfer solution like Signiant Media Shuttle. Media Shuttle is an easy and highly reliable way to transfer large files fast. More than 25,000 companies of all sizes use the solution and enjoy the enterprise security features and flexibility and control of choosing their own storage.

 

About Signiant

Signiant Media Shuttle is the fastest, easiest, and most reliable way for users to send and share large files. Used by hundreds of thousands of media professionals around the world, this cloud-native SaaS solution employs Signiant’s patented acceleration technology to dramatically speed up transfers over public and private IP networks. Authorized users can log in to Media Shuttle’s branded portals from any Web browser, gaining secure access to content via a super-simple user interface. Behind the scenes, the system can be configured to work with either local storage or cloud storage. Sold by subscription to businesses large and small, Media Shuttle is the de facto standard for person-initiated transfer of large files. For more information, please visit www.signiant.com/media-shuttle.

Artificial Intelligence at NAB 2018: real world applications


From local news to sports production, from delivery from remote locations to indexing and storage, Artificial Intelligence seems to be everywhere. We picked some of the solutions on show at NAB 2018.

While you may think AI only applies at a global scale, the truth is that even local markets can benefit from it. That’s something Ethan Dreilinger, solutions engineer for Watson Media and IBM Cloud Video, told attendees of the panel “AI for the Local Media Market,” where panelists examined real-world use cases for putting AI to work at media organizations: making production smarter and more efficient, creating personalized and relevant content experiences for users, and providing more targeted and effective advertising offerings for clients.

IBM Watson Media is an example of the diverse areas where AI can be used. The company offers solutions that “enable you to infuse AI throughout your media workflow or video library – unearthing opportunities to improve monetization, viewer engagement, content performance, ad revenues, and more.”

AI is mostly used, today, to search metadata and captioning, but the range of areas where it can be applied is growing while implementation costs fall. Watson Captioning, for example, promises accurate, compliant and easily editable captions for your video content, saving you time and money along the way. It offers automated caption generation, lets you finalize captions with an easy-to-use editor, runs faster than real time and is self-learning, meaning it becomes more accurate with every use.


 

Video highlights with AI

Watson Media, though, uses AI in other ways too, with Watson Video Enrichment being one example. Using industry-leading AI to analyze multimedia content, it builds easily searchable metadata packages for every asset in your library. This allows video content to be understood on a much deeper level and used to create stronger viewer engagement, improved content search and discovery, recommendation uplift and new monetization opportunities.

Real-time highlight clipping is one area where Watson Media can help sports broadcasters who need to create video highlights. Watson automatically watches, identifies and clips must-see moments, basing its decisions on the cheer of the crowd and other key signals, such as players’ reactions, to pick the best segments.

Artificial Intelligence and IP seem to go hand in hand when it comes to the future, and at NAB 2018 Mobile Viewpoint is showcasing not only new products that use Artificial Intelligence (AI) to automate low-cost delivery of content from remote locations and improve the efficiency of content indexing, storage and search, but also its broader range of IP contribution solutions.

[Image: The automated studio is coming]

Mobile Viewpoint is incorporating AI capabilities into two new products that allow smaller broadcasters and brand owners to automate live content production and delivery without having to invest in costly camera crews, production facilities and distribution platforms. They are targeted at sports stadia and other live events, including studio and field news production.  In addition, Mobile Viewpoint also introduced a new Smart storage solution that uses AI for voice, object and speech recognition to enhance video content indexing, logging and search.

The AI-driven News Pilot is presented as an “automated news direction and cameraman in one” offering “intelligent, unmanned, low-cost live production of news”. It’s a fully-automated software solution based on AI algorithms that automatically switches and controls PTZ cameras, plays graphics and stored content by analyzing 3D video images and audio signals from the TV studio. Automated Studio mimics a real director, leaving the presenters to do what they are good at, which is making a live broadcast.

This automated studio product enables broadcasters to make completely automated news productions without the need for a cameraman or director. A set of three controlled cameras, a switcher and a graphics engine, all controlled by MVP’s Virtual Director software, creates a vivid and accurate capture of any interview or news production.


AI to reduce camera crews

The second product, IQ Sports Producer, allows for “intelligent, low-cost live production and streaming of field sports with just one camera!”, according to Mobile Viewpoint. IQ Sports Producer “makes live and on demand web broadcasting available without the hassle of setting up multiple cameras and hiring a camera crew”, says the company. The system provides automated video broadcast production from a single 180-degree camera installation. With this technology, cost-effective cameras can be used for complete coverage of the playing field, and a panned and zoomed image is created automatically by specially developed game-tracking technology.

Mobile Viewpoint’s solutions may raise some eyebrows, as many AI-driven solutions do, but they meet broadcasters’ requirements for fast, simple and cost-efficient content contribution as they adapt their strategies to the rapid rise in online content consumption. Alongside its new AI-based solutions, Mobile Viewpoint is also exhibiting its complete Multicam wireless IP transmission portfolio at NAB, which includes the WMT TerraLink-4C Multicam rack-mount bonded transmitter and the WMT AirLink H.265-enabled encoder backpack. Both transmitters feature multiple (2-4) H.265 encoders and 8-channel audio, enabling multicamera, OB and 360 productions that can be remotely controlled.

Michel Bais, CEO at Mobile Viewpoint said: “As the industry continues to adapt to new consumption habits and a thirst for live content grows, broadcasters must adapt to new ways of working. We are also adapting our solutions to meet those needs: artificial intelligence will bring new and exciting capabilities to the next generation of content solutions, and we’re delighted to be able to showcase our own AI products at NAB this year. These will add to our existing, strong line-up of mobile content contribution solutions, already in use globally with customers such as BBC News and Sky Sports.”


Reduce costs, boost revenues

Presented as the world’s most comprehensive live event support tool and a major breakthrough in sports production, SMARTLIVE, from Tedial, is a unique development in the industry that leverages Artificial Intelligence (AI) tools to increase the number of highlights created automatically, reducing production costs and boosting revenues for production companies.

SMARTLIVE is a flexible solution for managing live event content: it produces, live-catalogs, archives, transforms and delivers sports content to maximize its visibility and monetization in the moment it happens. Tedial says that what makes SMARTLIVE unique in the industry is its capacity, before any action happens, to “ingest a LIVE data feed and automatically create an event inside its metadata engine. Simultaneously SMARTLIVE automatically creates the corresponding log sheets, the player grids and a schedule of the event. All these preparations are linked and organized by collections, so an entire season of sports events can be prepared automatically in advance.”

During events, live data is ingested, and the system can be configured to automatically create clips based on actions, keywords or logged occurrences. SMARTLIVE automatically pushes content to AI engines; video and audio recognition can be leveraged to generate additional locator data and annotate the media proxies. And the system can automatically publish clips or push content to social media platforms.
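As a rough illustration of that kind of rule-driven clipping (purely a sketch of the general idea, not Tedial’s SMARTLIVE engine): each logged event carries a timecode and a keyword, and any keyword on a watch list produces a clip with configurable pre- and post-roll.

```python
# Sketch of rule-driven clip creation from a live event log.
# Purely illustrative of the general idea -- not Tedial's SMARTLIVE engine.
PRE_ROLL, POST_ROLL = 8.0, 12.0          # seconds around each logged action
TRIGGERS = {"goal", "penalty", "red card"}

def clips_from_log(events):
    """events: iterable of (timecode_seconds, keyword) from the live data feed."""
    for timecode, keyword in events:
        if keyword.lower() in TRIGGERS:
            yield {
                "in": max(timecode - PRE_ROLL, 0.0),
                "out": timecode + POST_ROLL,
                "label": keyword,
            }

log = [(312.4, "goal"), (845.0, "substitution"), (1204.7, "penalty")]
print(list(clips_from_log(log)))  # two clips, the substitution is ignored
```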

Jerome Wauthoz, VP of Products for Tedial commented, “We couldn’t be more excited to bring Broadcasters, Rights holders, Clubs, Federations, or any organization involved in live production a ground-breaking paradigm that will dramatically transform the way they automate clips and deliver content across numerous platforms, including Social Media. By tightly integrating with AI tools, SMARTLIVE can generate an increased number of highlights during or after an event, and deliver to a very targeted audience increasing the potential for significant growth in fan engagement.”

Esther Mesas, CSO/CMO for Tedial, added “Another SMARTLIVE benefit for customers, that can’t be minimized, is the substantial financial gain. Thanks to the automatic highlight engine and AI integration, preparation time and production costs are reduced with more highlights being created with the same or less staff. And more content published means more revenues for these customers.”

At NAB 2018 Tedial demonstrates SMARTLIVE’s automatic highlight creation feature, its integration with AI engines, speech to text integration annotating incoming live media, and the automated publish feature.

MRMC debuts image capture solutions at NAB 2018

MRMC small-scale studio with AI

We’ve already mentioned here at ProVideo Coalition another name associated with Artificial Intelligence: camera robotics specialist MRMC, a Nikon company, which debuted at the 2018 NAB Show a new range of tracking technology and solutions offering more angles and creative control of image and video capture at sports venues, in cost-effective and reliable packages.

With sports broadcast being a key area – one where high-speed precision, flawless control and unquestionable reliability are a must – MRMC introduced its new Polycam Chat solution, which simplifies and augments the small-scale studio environment with AI while minimizing footprint and production costs. The system uses face detection in combination with limb recognition for unrivaled accuracy and stability. Polycam Chat automates camera operation for up to four presenters and guests in one studio and can track a talking head with maximum stability within the frame. The simple interface makes it easy for operators to use, while the flexible platform means it can be used with a number of different broadcast camera solutions, including Nikon DSLR cameras.


Xeebra and iconik

Another name to look for at NAB 2018 is EVS, which is demonstrating the effects of integrating artificial intelligence (AI) into its video-refereeing system, Xeebra. The new Xeebra 2.0 version features an AI integration which automatically calibrates the field of play – something that is time-consuming when done manually. As a result, setup time is significantly reduced and operators can easily place graphics onto the calibrated field with a high level of precision to aid their decision-making process.

Another player offering artificial intelligence solutions is Cantemo, which is introducing iconik at NAB 2018: a hybrid-cloud hub for managing, sharing and collaborating on media that uses Artificial Video Intelligence to automatically tag content with timecode-based metadata on ingest, recognizing even minute entities and objects.

Being cloud-based, and as part of Cantemo’s continuous delivery approach, iconik is in constant development with recent updates featuring improvements to the video intelligence workflow and image analytics. iconik now supports image and video analysis using AWS Rekognition, in addition to Google’s Cloud Video Intelligence.


AI is accessible and powerful

Veritone used the 2018 NAB Show to announce a breakthrough in real-time artificial intelligence capabilities and, to make sure the message reaches visitors, is present at several booths – Quantum booth 22125, AWS booth SU2022, NetApp booth S111LMR and Quantum booth SL8511 – where attendees can learn more about the future of real-time cognitive processing.

The real-time framework of Veritone’s AI operating system, aiWARE, is scheduled to launch in April 2018 with natural language processing (NLP) engines, and Veritone plans to expand its offering of real-time enabled engines across the rest of the company’s 15-plus cognitive categories – including transcription, sentiment, visual moderation, translation, and object, face and optical character recognition – during 2018.

The groundbreaking aiWARE real-time framework will enable companies and organizations to leverage data in seconds, optimizing their workflows and compliance. By 2021, AI-augmented services and applications will generate $2.9 trillion in business value and recover 6.2 billion hours of worker productivity, according to Gartner, Inc. (Predicts 2018: AI and the Future of Work).

“As more customers deploy and expand their use of artificial intelligence, we are increasingly seeing use cases where the speed of cognition is equally as important as accuracy and cost,” said Hong Bui, senior vice president and chief product officer at Veritone. “As a result of this upgrade, we’re now able to deliver transcription results within seconds, with best-in-class accuracy and pricing, and we believe that sub-second results will be possible in some other cognitive classes. The applications of aiWARE are limited only by our customers’ imaginations, as we continue to expand the real-time cognitive capabilities of aiWARE.”

“With this update to aiWARE, Veritone continues to show companies that AI is an accessible and powerful business tool,” said Ryan Steelberg, president of Veritone. “In the media and entertainment space, we believe that this new capability will help meet the needs of broadcasters who must adhere to FCC mandates for pre-recorded, live, and near-live programming, as well as enable faster and more cost-effective captioning.”

These are just some examples of artificial intelligence being used in different areas of the industry. The companies mentioned here in no way represent all of the names associated with AI that made it to NAB 2018 to show their own solutions for the future. The products and solutions, together with the conferences and panels about AI, give visitors a glimpse of what’s coming and of the advantages and problems the technology brings.

How to Avoid a Data Hack https://www.provideocoalition.com/avoid-data-hack/ https://www.provideocoalition.com/avoid-data-hack/#respond Mon, 06 Nov 2017 15:00:18 +0000 https://www.provideocoalition.com/?p=62419 Read More... from How to Avoid a Data Hack


As officials at Sony and HBO can tell you, it’s tougher than ever to ensure sensitive materials are being properly protected, regardless of how many resources might be available to an organization. While there was clearly a breakdown in security protocol at Sony since thousands of passwords were in a folder named “passwords”, the issues at HBO aren’t the result of a single source or issue. Both hacks provide lessons for media & entertainment professionals, and not having actual passwords in a folder named “passwords” is just the beginning.

Experts have declared that the entertainment industry is a prime target for hackers because of the money and influence associated with it, and many organizations have not engaged in robust third-party audits to help them find the gaps in their own security. What happened at Sony, HBO and even Netflix showcases why cyber security should be a top priority. These incidents have also given production professionals of all sizes some insight into how they can avoid putting themselves and their companies through similar ordeals.

Understanding Security Vulnerabilities

The security issues HBO dealt with were the result of problems with their supply chain, insiders who knowingly or unknowingly revealed more than they were permitted to, and compromised accounts. These issues enabled the theft of a total of 1.5 terabytes of data. Hackers released Game of Thrones scripts, company documents and unbroadcast episodes of other HBO shows, including Curb Your Enthusiasm and Insecure.

To say these are incredibly valuable resources for the company is an understatement, so how exactly were these vulnerabilities able to be exploited? How were they created in the first place? Anthony Juliano, an account executive at FUJIFILM, provided us with multiple explanations of what might have happened, and why.

“Just like any data breach, this one could be the result of one or many different circumstances,” Juliano told ProVideo Coalition. “It could have been the result of currently applied antivirus software that didn’t catch the breach, out of date antivirus software, ignoring basic security hygiene or repeated hack attempts by the intruder until they found a vulnerability in customer code.”

Exactly what happened at HBO isn’t the issue, but understanding the vulnerabilities that might have been exploited there absolutely is. Luckily, some of these liabilities are easy enough to diagnose and resolve. When it comes to ignoring security hygiene, users need to ensure they keep up with security maintenance. Keeping your software up to date is among the most important things you can do to keep your system and data secure. Additionally, had the HBO data been encrypted, the hackers would have had access to meaningless data with no value. And of course, not falling for a “phishing” email scam is sound advice no matter what you’re doing.
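
As an illustration of that point about encryption at rest, here is a minimal sketch using the widely available Python cryptography package. The file names are hypothetical, and a real deployment would also need careful key management – a key stored next to the data defeats the purpose.

```python
# pip install cryptography
from cryptography.fernet import Fernet

# Generate a key once and store it separately from the data
# (keeping the key next to the files is the crypto equivalent
# of a folder named "passwords").
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a sensitive document before it sits on shared storage.
with open("season_finale_script.txt", "rb") as f:   # hypothetical file
    ciphertext = fernet.encrypt(f.read())
with open("season_finale_script.txt.enc", "wb") as f:
    f.write(ciphertext)

# Without the key, the .enc file is meaningless to an attacker;
# with it, the original can be recovered whenever it is needed.
plaintext = fernet.decrypt(ciphertext)
```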

There are far more involved approaches and processes associated with properly securing your data, but how much of that is necessary? Are hackers zeroing in on media & entertainment professionals in an active way? How much of a concern should freelance professionals have in this regard?

Production Companies and Entertainment Professionals as Targets

Being able to impact how people think of or view a cultural phenomenon like Game of Thrones undoubtedly makes media & entertainment companies a prime target for hackers, but that notoriety is hardly the only factor at play. It’s important to remember that both Netflix and HBO had ransom demands made around their security breaches. Both of those factors have hackers across the world looking for vulnerabilities that are inherent in these systems, even if studios and production professionals are only one of their targets.

“Financial firms are high-value targets to hackers as sources of credit card information, bank account information, etc.,” Juliano continued. “However, any company that stays back-level on security maintenance is an easy candidate. Hackers look closely at new maintenance releases from companies like Microsoft, Adobe, Apple, and all major e-mail systems. They can clearly see the vulnerability and weak areas that are being fixed. That information tells the hacker where the weakness is to be exploited. They know customers are slow to apply maintenance and this gives the hacker time to attack.”

This is information that production professionals of all sizes should recognize, because it’s proof that it doesn’t necessarily matter how large you are or how valuable/sensitive the data you possess might be. If you’re using a system or process that has inherent weaknesses in it, you’re the exact sort of target that hackers are looking to find and exploit.

Additionally, many freelance professionals gain access to larger systems when they’re working on a project, and that access can turn them into an “accidental insider” which could allow someone with malicious intent to piggyback onto that access. These are the types of vulnerabilities that cyber security professionals look to prevent, but the most secure system in the world is still only as good as the people who access and utilize it.

Taking proactive steps to ensure your system or your organization is not an easy target should be a top priority for production professionals, but what exactly do those measures look like?

Pay Now, or Pay Later

Establishing specific security protocols and data processes with and for production professionals can be a difficult task. The technical logistics associated with doing so are often the last thing creative professionals want to deal with, and additional expenses related to such tasks are often not properly budgeted.

Nonetheless, there’s been a recognition across the industry of the repercussions of not taking this topic seriously. Security issues are no longer just the concern of the IT department, and freelance professionals are more open than ever to understanding what it means to stay secure. Despite that, actually installing these processes and protocols is usually not a simple or easy task.

“Unfortunately, there isn’t a silver bullet or one thing that can be done,” Juliano said. “It’s a combination of things that must happen. Studios must practice good IT and security hygiene, and that includes patching systems and applications, updating and modernizing systems/applications/infrastructure, controlling access to only those that need access, validating identities and encrypting or applying other safeguards to critical business systems. They also must implement stringent monitoring and alerting mechanisms as compensating controls for when or if an attacker breaks through their defenses.”

All of that might sound like a tall order, and for some large organizations it might be. It doesn’t have to be such a process though, since little things like updating your software and hardware exponentially increase the security of your system. Those software updates often mean little more than clicking the “Update” icon when it pops onto your screen. It’s something that many people still don’t do.

There’s a need to go far deeper, and cyber security experts can help with that process. Additionally, tried-and-true methodologies like the 3-2-1 rule are still effective. It states that enterprises should have three copies of backups on two different media types, one of which is kept offsite. Freelance professionals should consider how that kind of setup could work for them, even if it’s on a smaller scale.
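
As a rough illustration of how even a small shop could sanity-check that rule, here is a short sketch; the paths and media labels are hypothetical stand-ins for whatever your own backup targets are.

```python
# A minimal 3-2-1 sanity check: three copies, two media types, one offsite.
backups = [
    {"path": "/Volumes/RAID/projectX",      "media": "disk",  "offsite": False},
    {"path": "/Volumes/LTO_shelf/projectX", "media": "tape",  "offsite": False},
    {"path": "s3://backup-bucket/projectX", "media": "cloud", "offsite": True},
]

def check_3_2_1(copies):
    ok_copies  = len(copies) >= 3                            # three copies
    ok_media   = len({c["media"] for c in copies}) >= 2      # two media types
    ok_offsite = any(c["offsite"] for c in copies)           # one offsite
    return ok_copies and ok_media and ok_offsite

print("3-2-1 satisfied:", check_3_2_1(backups))
```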

Data security issues often come down to a question of paying now or paying later. And paying later is always more expensive.


Providing Defense Against Cybercrime https://www.provideocoalition.com/providing-defense-cybercrime/ https://www.provideocoalition.com/providing-defense-cybercrime/#respond Mon, 30 Oct 2017 10:00:54 +0000 https://www.provideocoalition.com/?p=60857 Read More... from Providing Defense Against Cybercrime

Given the changing landscape of the IT industry, many of the original concepts about backups are delivering additional value and are now back in style. One of those original backup concepts is the 3-2-1 rule. This rule states that enterprises should have three copies of backups on two different media types, one of which is kept offsite. There are two ways to have an offsite copy – either with an online (electronic access) or an offline (manual access) cloud. The offsite and offline copy is rapidly becoming more critical and describes what is now referred to as an “air gap”. An “air gap” is an electronically disconnected copy of data that prevents rolling cybercrime disasters from reaching your backup copies. The only way to create a physical air gap is to copy something to removable media and send that media offsite. This means tape media for almost all data centers. Off-site backup and storage facilities can be online or offline and are often among the most physically secure facilities in the industry – they are frequently built into solid granite mountains, underground bunkers or other highly discreet locations.


You can put an electronic air gap between your backup server and backup storage by making sure that the backup is not accessible via any network or electronic connection. Most tape cartridges typically reside in library racks, meaning they are offline well over 95% of the time (protected by the air gap) and are not electronically accessible to hackers.

The air gap prevents cyber-attacks because data stored offline – without electronic access – cannot be hacked. Ransomware, for example, is the latest crypto-viral extortion technique: it encrypts the victim’s files to make them inaccessible, then demands a ransom payment to decrypt them. The newest of these attacks embed time-delayed, undetected malware into your backup repositories, sometimes taking several months to reactivate. This makes file restoration pointless, because as you recover your data the ransomware re-ignites and re-encrypts the data all over again. This is known as the Attack-Loop™.


Whether you have the best backup solution, the latest anti-virus protection, or multiple versions of backup repositories, this next generation of cybercrime is evolving so quickly that those safeguards seldom matter anymore. In a cloud-based backup, critical data is backed up over the Internet and most likely stored in a shared storage infrastructure at an off-site data center maintained by a third-party cloud company providing backup, archiving and replication services.

Fortunately, software that counters the Attack-Loop is becoming available; it uses signature-less technology which checks and quarantines malicious code upon entry into the backup repository and again prior to recovery into your environment. Combining offline tape storage with this kind of software yields the greatest chance of preventing cybercrime. Ultimately, though, it can become a full-time job to keep up with the latest detection tools designed to protect your data. Having the best backup is only okay if you make sure you stay current with the updates.

Keep in mind that tape technology is not standing still. IT executives and both online and offline cloud service providers are addressing new applications that leverage tape for its security, air-gap cyber protection, and economic advantages. Tape has expanded its position as a highly effective complement to flash and disk for the foreseeable future thanks to reliability roughly three orders of magnitude higher than HDD, higher capacities, much faster data rates, and significantly lower energy costs. The TCO for HDD is typically 6-15 times greater than for tape, offering significant cost savings, and easy-to-use TCO tools are available to determine your specific TCO. This recognition is driving continued investment in new tape technologies, defining robust roadmaps that face few limits for the foreseeable future.
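
The exact figures depend on your environment, but the back-of-the-envelope version of such a comparison is easy to sketch. The dollar figures below are made-up placeholders, not Fujifilm’s numbers, and a real TCO tool would also account for floor space, migration and staffing costs.

```python
# Rough, illustrative total-cost-of-ownership comparison per petabyte.
# All dollar figures are hypothetical placeholders for the sake of the example.
def simple_tco(cost_per_tb, years, annual_energy_per_tb):
    capacity_tb = 1000  # one petabyte
    return capacity_tb * (cost_per_tb + years * annual_energy_per_tb)

hdd_tco  = simple_tco(cost_per_tb=25.0, years=10, annual_energy_per_tb=4.0)
tape_tco = simple_tco(cost_per_tb=8.0,  years=10, annual_energy_per_tb=0.3)

print(f"HDD:  ${hdd_tco:,.0f}")
print(f"Tape: ${tape_tco:,.0f}")
print(f"HDD is roughly {hdd_tco / tape_tco:.1f}x the cost of tape in this example")
```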

Bottom line: given the rising wave of cybercrime, tape-based offline storage and cloud solutions that take advantage of the “Tape Air Gap” are back in style.


Easy, fast and secure media file transfer–the Bigfoot of post https://www.provideocoalition.com/easy-fast-secure-media-file-transfer-bigfoot-post/ https://www.provideocoalition.com/easy-fast-secure-media-file-transfer-bigfoot-post/#respond Wed, 25 Oct 2017 18:53:16 +0000 https://www.provideocoalition.com/?p=61481 Read More... from Easy, fast and secure media file transfer–the Bigfoot of post


I have a long list of “It’s 2017, how come we haven’t figured out how to…” questions. Topping that list (alongside why Siri still can’t look up a phone number for me without mistakenly calling someone to whom I owe money instead) is the issue of transferring video files securely across the web. There typically seems to be a choice: send files easily or send them securely. The easy way is an email attachment if it’s small enough, wetransfer.com if it’s a little bigger, or a thunderbolt drive couriered across town if it’s colossal. The secure way is an expensive, industrial service that typically requires the installation of software on your client’s machine whether they like it or not. So taking all that into account I was curious to try out Signiant’s offering, Media Shuttle. This is a service specifically geared toward ease-of-use and is perfect for small to mid-sized companies (i.e. boutique post houses). It promises the simplicity that busy people demand without compromising security. Let’s take a look.

You can hold my hand, but don’t waste my time

I’m not a big fan of user guides or guided tours. I’ll waste a couple of hours trying to figure things out by myself before asking for help. (In other words, I’m a typical male.)

Having said that, I was pleasantly surprised by the screenshare introduction to Media Shuttle. Maybe I got lucky with the session being hosted by Signiant’s Tyler Cummings, but within about 20 minutes of my time (thirty counting my obnoxious questions) Tyler had pretty much shown me the entire system from start to finish.

So what exactly is Media Shuttle and how does it work? There are three main ways to move files across the internet using Media Shuttle: “Send”, “Share”, and “Submit” (the first two are for sending, the third is for receiving files to your server).

The Send Portal

The Send Portal option is a very simple email web page–think wetransfer.com. Type in a recipient’s email address, drag the files you want to send to the browser, and click send.

Now when I make the comparison to wetransfer.com it’s a pretty shallow one; there’s a whole lot more going on here than your standard WeTransfer session.

First of all, there’s no file size limit. That’s a bigger deal than it seems at first. Even with open source, self-hosted systems like ownCloud large transfers frequently fail to deliver successfully.

Secondly, you can send entire folders and subfolders without zipping them first. This is huge if–like me–you’re constantly dealing with massive EXR and DPX image sequences. A multi-layered EXR can easily weigh in at over 100MB per frame; trying to zip those suckers up first (if you actually succeed) can take longer than sending the actual files.

Media Shuttle actually uses its own proprietary transfer protocol optimized for large file sizes rather than the office documents that generic transfer services are designed for. We’ll talk a little more on that later, but in essence it means that your transfer speeds won’t suffer from being uncompressed. (In fact, media files rarely benefit from being zipped–they’re already compressed at the file level.)
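
If you’re curious whether zipping your own footage is worth the wait, a quick check like the sketch below usually settles it; the file path is hypothetical, and already-compressed media tends to come out at close to 100% of its original size.

```python
import zlib

# Quick check of how much a given media file would shrink if zipped.
path = "shot_010_v003.0001.exr"  # hypothetical frame from an image sequence
with open(path, "rb") as f:
    data = f.read()

compressed = zlib.compress(data, 6)  # same deflate compression zip uses
ratio = len(compressed) / len(data)
print(f"{path}: {ratio:.0%} of original size after compression")
```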

Next, you can send with whatever level of security is required. Is the content extremely sensitive? You can require that the recipient logs in securely to the Media Shuttle portal before downloading the files. For additional protection, you can require a user-generated password for download. Or do you have a technology-challenged client who wants a straight download without the hassle? Assuming the content isn’t too sensitive, you can set up the download to be a single click from the email.

Another nice touch is the ability to customize the look of each portal page. This can make your one or two man shop feel a lot bigger and slicker than it really is. (After all, the client never needs to know that you finished the conceptual art for their campaign in your bathroom on a laptop.) Unique pages can be created for different clients or projects, at no additional cost.

The Share Portal

Now I’ve worked on small campaigns and big Hollywood blockbusters, and I’m amazed at how often proprietary information just gets thrown on an unsecured FTP server for me to download. This is especially true in the last two weeks before picture lock on a movie; studio execs with little tech savvy will find the path of least resistance when they need to get files to a post house during crisis hour. Security goes out the window and production staff find FTP servers an easy way to give vendors instant access to a set of files.

Media Shuttle offers a no-nonsense “FTP-like” interface to entire folder structures, so a client can browse and choose the files and folders they want to download. But it’s not FTP–it uses a 256-bit AES encryption system, the security scheme recommended by the MPAA.

Security is, of course, paramount, and FTP is an inherently insecure way to share files. SFTP is better, but typically frustrates non-savvy users who have to configure an FTP client to the correct protocol.

As the host of the files, you’re completely in control of which folders a given client can have access to. Permissions are easily configured from the portal management page. On numerous occasions I’ve been frustrated trying to set permissions on SFTP servers only to end up either locking out the client or showing them too much. The Media Shuttle configuration is blissfully simple in comparison.

The Submit Portal

A third transfer option pushes files to your storage. This is basically a drop box that allows a client to send you a file (and add an optional message). Again the emphasis here is on simplicity: drag files or folders to the browser and you’re done. To benefit from Signiant’s advanced transfer over UDP, the Signiant app needs to be installed (it’ll be automatically triggered by the browser page).

The only thing I would have liked here is an option to automatically generate a timestamped folder name for incoming files (e.g. incoming_05_31_17_1500) to help keep versioning straight under deadlines, but again this starts to add custom complexity. That kind of thing can always be handled via a local watch script sitting on the server anyway.
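
For what it’s worth, a local watch script along those lines only takes a few lines. This is a hypothetical sketch – the submit directory path is made up, it polls naively, and it doesn’t check that arriving files have finished transferring – not anything Signiant ships.

```python
import shutil
import time
from datetime import datetime
from pathlib import Path

SUBMIT_DIR = Path("/srv/media_shuttle/incoming")  # hypothetical submit-portal target

def sweep_incoming():
    """Move newly arrived files into a timestamped subfolder, e.g. incoming_05_31_17_1500."""
    stamp = datetime.now().strftime("incoming_%m_%d_%y_%H%M")
    arrivals = [p for p in SUBMIT_DIR.iterdir()
                if p.is_file() and not p.name.startswith(".")]
    if not arrivals:
        return
    dest = SUBMIT_DIR / stamp
    dest.mkdir(exist_ok=True)
    for item in arrivals:
        # A production version should confirm the upload is complete
        # (e.g. stable file size) before moving anything.
        shutil.move(str(item), str(dest / item.name))

if __name__ == "__main__":
    while True:          # naive polling loop; cron or a proper watcher also works
        sweep_incoming()
        time.sleep(60)
```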

Simplicity beats complex efficiency

I feel the need to digress for a moment on this point. I can’t emphasize enough how simplicity trumps customization. The obvious example in the wider world is Mac vs. PC. But I’ve seen it time and again in the post-production world. Complex systems break down in a crisis. And production staff are great organizers but not always the best technologists. They’re also way too busy to spend a day in a seminar learning how to use a proprietary digital asset management system.

So even though Media Shuttle emphasizes simplicity, it no doubt fits well with the larger studios. I remember when we first implemented Shotgun at my studio: for the first three months I had a PA translate the reports and enter data for me because I simply didn’t have the downtime to learn how to use the interface. Media Shuttle does its job and hides the complexity. No need to write a thousand lines of python scripting just to get files moving.

As a quick note: APIs do exist for larger studios looking to implement Media Shuttle with their DAMs and MAMs; they’re just not necessary for taking advantage of Media Shuttle’s file transfer services.

Managing the system

Configuring a file transfer service is typically a nightmare for those of us who don’t spend our lives grepping, piping, and rebooting Apache services on Linux servers. What surprised me as I went to work on Media Shuttle is the fact that I didn’t need to pull up a single help doc or call for tech support. Let me give you an example workflow:

To add a client to the Share Portal (the “FTP-like” web page) I choose “Person to/from File System” from the choice of portals at the top of the management page.

Then I select the members tab and click Add. I type in their name, email address and the expiration date–the date I want their access to the site to cease. By default they have access to the entire server. But if I simply click the Change button, I can choose a specific directory level of limited access. Finally I check the boxes for upload access, download access, and edit access and I’m done. The client is now able to access the relevant files and only the relevant files.

There is a help section at top right of the page that explains options in plain English. At no point did I feel like I needed to be a Microsoft Certified Systems Engineer to manage the system. I’ve had a harder time configuring my home wifi router.

You can create as many of these portals as you like to help partition work between clients, or tasks (e.g. VFX vs editorial). So the system can be extremely flexible, even while keeping the complexity to a minimum.

Hosting

Just to be clear about one point: Media Shuttle is a transfer service, not a hosting service. They don’t own the storage your stuff sits on. Who does? Well, you do probably. Media Shuttle works either as an application sitting on top of your existing Linux or Windows file server, or in partnership with a cloud-based hosting service.

The only bummer here is lack of OS X server support. While many studios have switched to some kind of Linux file server, there are plenty of small creative boutiques still serving their files from a Mac Mini, so that’s something to consider when switching to Media Shuttle. When I mentioned this to Signiant they recommended using a virtual machine like Parallels on the host server. Seems like that would work just fine; data transfer to/from the outside world would be a bottleneck long before the overhead of the VM became an issue.

As far as configuration goes, Tyler from Signiant assured me that the install was painless and that if any complications arise Signiant’s tech support can walk you through configuration in a matter of minutes.

For my evaluation I took advantage of the cloud-based hosting option. Media Shuttle is ready to work out of the box with either Amazon’s S3 offering or Microsoft Azure. In the case of cloud hosting, Signiant attaches to either of the services. The process and offering are the same, albeit with the addition of a monthly payload allotment to account for cloud service transfer fees. I can’t speak to the locally hosted experience (although I have a feeling I’ll be implementing it sometime over the coming year), but the cloud-based service behaved flawlessly.

Security and transfer rates

I am not an internet security expert, nor a dark web hacker, so I can only take Signiant’s word as to the security of their system. However, having previously endured the process of MPAA certification at my own studio and having been involved in security audits while working on the lot at one of the majors, I can attest that Signiant’s transfer process being in line with MPAA recommendations is no small thing.

As far as transfer rates, in my very informal testing I found them to be “alarmingly fast.” How fast is that? Well, 2.1 GB of EXR files took 52 seconds to download from the Amazon host server. 20 seconds of that time was file prep and handshake (which I imagine would be pretty consistent even on larger pulls), so that puts the actual transfer speed at around 525 Mbps. Now my connection on speedtest.net measures at around 370 Mbps. How can Media Shuttle be that much faster? I have to assume it’s because speedtest.net is measuring based on highly-redundant TCP packets, while Media Shuttle is leveraging its proprietary protocol sitting on top of UDP. Whatever the case, it’s fast. (In a very unscientific comparison test, 150MB of EXRs took less than 10 seconds to upload–including the handshake period at the front end, while the same files took over 3 minutes to send via the free wetransfer.com service.)
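
For anyone who wants to reproduce that back-of-the-envelope math, the calculation is straightforward; the numbers below are simply the ones from my informal test.

```python
# Effective throughput of the informal test described above.
payload_gb = 2.1          # EXR files downloaded
total_seconds = 52        # wall-clock time for the pull
handshake_seconds = 20    # file prep and handshake at the front end

transfer_seconds = total_seconds - handshake_seconds
throughput_mbps = payload_gb * 8 * 1000 / transfer_seconds
print(f"{throughput_mbps:.0f} Mbps effective")   # ~525 Mbps
```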

What’ll it cost me?

When enterprise-style solutions are discussed in the post-production world, most people assume that “if you have to ask the price, you can’t afford it.” As we’ve mentioned though, Media Shuttle is perfect for boutique studios, so the price obviously needs to work for that market segment. The “professional” packages begin at $7,500 a year but there’s special pricing for companies with fewer than 20 employees. Everything is based on the number of active users per month; there’s no limit to how much data you’re permitted to transfer. The Signiant definition of an “active user” is someone who sends a file or receives three or more files in a given month. These are floating user licenses, so you can have as many users as you like in the system (e.g. multiple clients) and in a given month only those users that meet this definition of “active” will be counted against your quota.
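
Since billing hinges on that definition of “active”, a small sketch of the counting logic may help when estimating your monthly bill. This is just my reading of the definition, expressed in Python, not Signiant’s code; the log format and email addresses are hypothetical.

```python
from collections import defaultdict

def active_users(transfer_log):
    """Count users who sent a file, or received three or more files, in the month.

    `transfer_log` is a list of (user, direction) tuples for one month,
    where direction is "send" or "receive".
    """
    sends = set()
    receives = defaultdict(int)
    for user, direction in transfer_log:
        if direction == "send":
            sends.add(user)
        else:
            receives[user] += 1
    return sends | {user for user, count in receives.items() if count >= 3}

# Hypothetical month: the editor sends, one client grabs four files,
# another client downloads only once and so doesn't count as active.
log = [("editor@studio", "send"),
       ("client_a@agency", "receive"), ("client_a@agency", "receive"),
       ("client_a@agency", "receive"), ("client_a@agency", "receive"),
       ("client_b@agency", "receive")]
print(active_users(log))   # {'editor@studio', 'client_a@agency'}
```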

If you’re just getting started and running a lean ship, that might seem a little steep. But one thing that was clear to me from my conversations with Signiant: their pricing structure isn’t set in stone. Signiant is eager to make the service work for smaller companies (under 20 employees) and will endeavor to design a package around the needs of the client.

Overall, I’ve been impressed by the flexibility and transparency of the pricing structure. Want to create separate portals to the physical storage at your studio and a different setup at your house? No extra charge. Have media storage pools distributed between offices in New York, Los Angeles, and Sydney? Create a unique portal for each. What about local storage for team collaboration and an Amazon S3-based portal for clients? Again, no additional hidden fees.

Of course if you’re trying to court bigger clients, even the $7,500 annual base could end up being a trivial cost in comparison to the payoff. Larger client businesses will instantly balk at unprofessional work practices and media transfer is one of the first encounters a new client will have with you in this connected world.

Conclusion

As you’ve probably already guessed, I was impressed with the Media Shuttle ecosystem. It’s rare that I come across enterprise level software that has as much intuitive usability. With the caveat that I’m yet to test the system on my locally-hosted file server, I’ve found little not to like with the service.

Boutiques already well established will instantly see the value proposition here. The alternative is to run your own file services. This might seem like a good (cheaper) approach out of the gate, but once you add configuration, IT costs, lack of security, and the sheer amount of time spent trying to manage permissions and maintain everything, $7,500 a year can be a downright bargain.

Visit Signiant’s Media Shuttle site for more information.

Meet the alternative to Adobe: Skylum https://www.provideocoalition.com/meet-alternative-adobe-skylum/ https://www.provideocoalition.com/meet-alternative-adobe-skylum/#respond Wed, 25 Oct 2017 17:02:49 +0000 https://www.provideocoalition.com/?p=61780 Read More... from Meet the alternative to Adobe: Skylum


Macphun has a vision: to offer photographers a truly worthy Adobe alternative. This was revealed today by its CEO, Alex Tsepko – now CEO of Skylum Software.

Macphun started seven years ago building apps for the iPhone, but the company soon found its true calling: photography. Although the company launched close to 60 applications in its first three years, many of them with no relation to photography, it was apps like FX Photo Studio, Silent Film Director, and Perfect Photo that brought its biggest successes, with more than 20,000,000 downloads.

Early in 2011 the company launched the first Macphun photo software on the Mac App Store – FX Photo Studio Pro. A year later, the first recognition from Apple – Snapheal was named among the Best Apps on the App Store – started a trend, and Macphun products have been collecting this and similar awards non-stop ever since. Alex Tsepko says: “I believe Macphun is the only photo software developer in the world to hold this recognition for five straight years in a row. We also have more “Apple Editor’s Choice” awards than most app developers on a Mac.”

After those initial apps, Macphun made a name for itself, driven by the idea of helping photographers on the Mac make, as Alex Tsepko puts it, “stunning photos in less time”. The CEO of the now-renamed Skylum says the company continues to be driven by the same idea, but now “in a much bigger scale”.

This year, as ProVideo Coalition noted, Macphun, now Skylum, launched its products for Windows, expanding its client base to a new universe. This happens at a time when many photographers are looking for alternatives to Lightroom, a search made even more urgent by the announcement that Adobe has decided to end the perpetual license for Lightroom. For many users it’s essential to find alternatives, and Macphun, which used Adobe’s announcement to promote its Luminar and Aurora HDR programs, decided it was time to take an extra step. The announcement made by Alex Tsepko of the name change and of the plans to offer photographers a truly worthy Adobe alternative is a bold move that will, no doubt, catch the attention of the market.

The release of the most advanced versions of Aurora HDR & Luminar for both the PC and Mac photography communities confirmed to Macphun that it had found a new market. According to its CEO, audience growth this year was larger than in the past three years combined. So Macphun believes it can go from there and challenge Adobe.

Alex Tsepko says that ”we feel we are among the few companies who can achieve this goal. Adobe is a fantastic company and a well-deserved industry leader. We admire them and our products work within their architecture as plug-ins. But we also have a great team, our own proprietary technology, and the community support to make a dent in that Universe. ”

The name change is a rather interesting move and one that was bound to happen. When the company first announced it would launch software for Windows, I asked them if it wasn’t a good idea to create a company named Winphun. Apparently, they decided to follow my advice, and take it further. Now, for Macphun, the sky is the limit. Gradually, the company we now know as Macphun will become Skylum Software. And as Alex Tsepko writes in his post, they are not yet an alternative to Adobe, but they want to be, and “we are just getting started…”

Lightroom: the death of the perpetual license https://www.provideocoalition.com/lightroom-death-perpetual-license/ https://www.provideocoalition.com/lightroom-death-perpetual-license/#respond Sun, 22 Oct 2017 17:17:48 +0000 https://www.provideocoalition.com/?p=61625 Read More... from Lightroom: the death of the perpetual license


Adobe’s subscription model is not for everyone, and the recent changes in policy are forcing many photographers to look elsewhere for alternatives to Lightroom, whose perpetual license model just died.

Lightroom 6 is the last version of Lightroom with a perpetual license. There will not be a Lightroom 7 with a perpetual version, and the only thing owners of the program can expect in terms of updates is a final one, Lightroom 6.13, available October 26, 2017, which will make the program compatible with the NEF format from the Nikon D850 – something users have requested for quite a while. That’s the end, according to Adobe.

The end of Lightroom without a subscription goes against the promise made in May 2013 by Adobe’s Tom Hogarty and published on the company’s blog. When asked “Will Lightroom become a subscription only offering after Lightroom 5?”, Hogarty replied: “Future versions of Lightroom will be made available via traditional perpetual licenses indefinitely.” For photographers who have not subscribed to any CC plan, this change in policy is, apparently, the end of the road, and many are already looking for alternatives, if they have not picked one already. In fact, while a decade ago Lightroom was both a newcomer and the king of the RAW editor segment, its position is far from certain now, as the market has evolved to offer multiple alternatives, no matter what Adobe says.

Lightroom now comes in two flavors, both under subscription: Classic, which keeps everything you know from the previous CC version, and a new one, built with the web in mind – and created to solve the speed problems of the current Lightroom – which does not offer the DAM (Digital Asset Management) options that made Lightroom so interesting for many. With the new Lightroom CC your files will be online, on Lightroom’s servers, always accessible. No more catalog or collections, but a high-speed Internet connection, wherever you are, is a must if you want to use it. And you’ll have to pay for space for your files, at $9.99 per Terabyte per month.

Let’s leave to others the discussion of how viable – and costly – this solution is. Lightroom moving to a subscription-only service does not mean you have to give up photography altogether. The words Lightroom and Photoshop may still remind us of how the word “Kodak” meant “camera” decades ago, but in fact many people use other programs to edit their photographs. Alternatives to Photoshop have always existed, and the launch of Lightroom a decade ago helped open a new segment of the market, with similar solutions.

Photoshop and Lightroom also created the base for the development of plugins – software that extends their functions – and, not surprisingly, it’s from many of those plugins that today’s alternatives to both Adobe solutions have emerged. They are out there, waiting for you to discover them. From Topaz Labs, which has its own completely free editor, to the sophisticated ON1 Photo RAW, which became a full-fledged program – a mix of Photoshop/Lightroom and all the plugins you may need – the old plug-ins have become independent, able to offer everything needed for a photographer’s workflow.

https://youtu.be/GlAYvjlA1bE

Many of the new programs have been mentioned here at ProVideo Coalition, and I’ve not covered all the options, because there are more solutions than the names I am most familiar with. From Corel to ACDSee, from Capture One Pro to the free RAW Therapee or LightZone, there is a solution to fit the needs of different photographers. For those who need a DAM included in or working alongside their photo editor, some of the new programs offer some form of catalog. One I tried briefly in 2017 is from ACDSee, which now has the new ACDSee Photo Studio Ultimate 2018, mixing a complete DAM solution with a very capable photo editor.


Last July, in an article about 10 years of Lightroom and the available alternatives, I wrote that if you did not need or like Lightroom’s catalog there were other options available. Well, now I can add that, besides the DAM alternatives already available, like ACDSee’s, and the DAM promised for Affinity Photo, Macphun has also announced that its Luminar RAW and photo editor, available for Mac and soon for Windows, will include a DAM module in 2018, free to owners of the Luminar 2018 version, which costs $69.

One should also remember that if you do not need a catalog similar to the one present in Lightroom, ON1 Photo RAW 2018 offers a browser that is fast and, when used with the option to index folders, makes for a reliable way to use keywords, colors, stars and other functions to manage your files. So, there are other ways to build your workflow.

Adobe says that the end of Lightroom with a perpetual license is the company’s answer to a reality: “Customers are overwhelmingly choosing the Creative Cloud Photography plan as the preferred way to get access to Lightroom. We’re aligning our investment with the direction our customers have signaled over the last several years.”

Visiting many online forums and reading the comments published there, it seems that not all customers are choosing the Creative Cloud Photography plan. Add to those comments the fact that many of the alternative programs base their marketing on the “no subscription needed” motto – and seem to thrive on it – and Adobe may be throwing away a huge part of its client base. Maybe they don’t care, as Adobe has now created ways for professionals to pay to keep their images on its servers. At $9.99 per Terabyte, monthly. Added to the cost of the software… or service.
