NZVN October 2016


OCTOBER 2016

IBC 2016

Vol 229

did not disappoint for innovative products on show. Occasionally, one wonders why some have stands, but not this year. At more than one interview, I had to refocus on what I was being told, as my brain struggled to take in the really clever developments being presented. It almost makes paying $500 per night for an average hotel worthwhile, but it did mean packing everything into 2 days of intense interviewing. We bring you ARRI first of all, but they were not the only ones to have wowed the crowds and me alike. Read on. Ed

ARRI Cameras & more

We are on the ARRI camera stand with Forest Liu. Ed: So Forest, you're the camera person for the Asia-Pacific region and if Brett Smith has any questions, he comes to you? Forest: Yes, pretty much. Ed: Anyway, we're here at IBC and Forest is going to show us some new things that have happened with ARRI in the camera line since NAB. Forest: We see ourselves as a tool manufacturer, so we want to build the tool that is best for cinematographers and creative filmmakers who want to create art, create images that really inspire people, and obviously entertain people as well. To do that, we go about it saying "okay, so what technology do we have, what's available to us and how can we make something that is useful and valuable to our users?" We're not so much thinking in terms of "oh, there's a price range where we could make some money" – it's very different. We really want to create this for them because they're the filmmakers that we aspire maybe to be at some point. Ed: And in that vein, the new product line, the SXT … and what's that for the uninitiated? Forest: Basically, the ALEXA came out back in 2010 – this is our first kind of bigger digital camera that we released. We did have the D20 and the D21 before that,

Forest with the ALEXA SXT.


but they were pretty experimental, they were kind of not ready yet for the time. I think we weren’t quite there yet. Ed: The less said the better on those ones. Forest: Yes let’s not talk about those. So with the ALEXA we got pretty much everything right and it is proven now that, 6 years later, this is kind of the standard camera in filmmaking, in television and we’ve now gone through 3 generations of it. So we had the ALEXA, then we upgraded to the ALEXA XT which brought RAW recording into the camera and a couple of days ago we started shipping SXTs which is the 3rd generation of the ALEXA camera. Ed: Right, so the full name is the ALEXA SXT? Forest: Yes. SXT stands for Super Extended Technology. Ed: How can you go further than “Super”? Forest: I don’t know yet. We haven’t really released anything on the next generation of ALEXA. Ed:

Because you’ve got Superman so how can you …?

Forest: I guess you could go to … Ed: Oh, Superwoman? Forest: Superwoman, why not … and I guess with ALEXA it is a female name, so perhaps that is true. Ed: Okay, so for all the girls out there, what have you got? Forest: The SXT brings larger capacity media, so it's a new media bay on the camera. This will allow you to use the new 2 terabyte and 1 terabyte cards, but also all the old ones – the 512 XR cards that we had on the XT, the SxS cards and the CFast 2.0 cards that we use on the AMIRA and Mini. It does require a little adapter piece for each of the different media, but it's backward compatible all the way to the original ALEXA cards, except for the smaller capacity ones from that time. Ed: So you're saying now that there's no reason to continue any relationship with Codex as an external recorder device? Forest: With the XT, the Codex external recorder already became internal, and we continue that. So the SXT still has Codex technology inside, internally, but now with the ability to take higher capacity SXR capture drives. Ed: Right. So in terms of actual visual difference, it pretty much looks the same? Forest: Yes, there's hardly any difference. The media bay is different but it's hard to recognise that from the outside. It would require you to see the label on the camera to be 100% sure that it's the new SXT. But the internals are completely new. Almost everything except for the sensor is new inside the camera and it's based on what we learnt building the ALEXA 65, so it has the same processing power as the ALEXA 65 and that's very powerful – really it's a computing device inside. And what is very cool, and a lot of people underappreciate this, is that it has 4 different, completely independent monitor outs. What that allows you to do is give the director, the cameraman, the operator, the DoP, the AC and, on commercials, probably the client totally different looks and information on the screen. So whatever is useful for them, they get. In the past, if you were on a commercial shoot, you had the client, they wanted to see vivid contrasting colours and it has to look great, but they don't want to see all the little lens data and all that stuff. Ed: You don't want to show them a RAW output? Forest: Exactly, so at the time, you had to decide who got what, or for whom you would set up the monitor out, and now you can have it with 4 different ones. You can give everybody the exact look that they should get and the information they should see on the screen. So it's very cool and I think a lot of people are going to start using that and demand it. Ed: But there's more. Forest: There's more … what has happened with the SXT is that we've taken the colour processing and the 3D LUT support that we have with the Minis and the AMIRAs, and we've added it into the SXT, so they're now all

You will see more about this beauty later.

Go to https://sites.google.com/site/nzvideonews/ for more news.

Lots of treats from Amsterdam in this issue.

DISPLAY & CLASSIFIED ADVERT BOOKINGS BY WED 2 NOV
ADVERT COPY BY NOON FRI 4 NOV
UP ON THE WEB BY FRI 11 NOV


fully compatible with the new ALF-2 file format. What that really means, and I think that's also where we're very different from many of the other manufacturers, is that we have a lot of different camera bodies, different sizes, designed for different types of shooting – with the AMIRA shoulder mounted, this larger crew, multi-people kind of use camera, and then the Mini with special functions – but all the images are compatible with each other, you know, exactly from the same sensor, same kind of processing, either ARRIRAW or ProRes … Ed: So in other words, in postproduction you could take images from any one of those and give them a very similar look? Forest: That's exactly what it is and you can do it on set with the same 3D LUTs. So if you design a 3D LUT for the SXT, you can also use that same LUT file on the Mini. But it's more powerful than that. If you think about it, when we had like a main camera XT or SXT and we needed a smaller camera, all of a sudden it was a different camera and it looked different and you had to spend a lot of time grading it and getting them to match up properly. Now you can take the Mini, you can take the AMIRA even, and they all match up together, so you can always use the camera body that best fits your shooting requirement, and you don't have to worry too much about having a different image quality. The ALEXA has always been very good for HDR imaging and now with these new 3D LUTs we can actually have an HDR 3D LUT in the camera that'll give you (if you have an HDR monitor) a perfect HDR monitor output. Very beautiful stuff. Ed: You know what HDR is do you? Forest: Yes, High Dynamic Range. Ed: Aaah, right. Now here we are, we're explaining that HDR is different if you're talking cameras or if you're talking monitors. So if you're talking monitors, it's a bit of a mess out there and nobody really knows, or everyone has a different definition of what High Dynamic Range is, but if you're talking cameras, ARRI has its own definition which is …? Forest: Well it is how much data or how many stops can you get between the brightest and the darkest part of your image and kind of the entire range. How much of this range and detail can you still get in the brightest and the darkest parts? And then, when you're doing the monitoring, because everybody has a different standard, you have to grade to the different standards, so even for us, the 3D LUTs that are then in the camera that are HDR, they have to be graded for the monitoring device. But because we're capturing all that data and the detail and the full range, it's very easy for us to regrade that to match whatever output monitor you have.
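For readers who like to put a number on Forest's definition, dynamic range in stops is simply the base-2 logarithm of the contrast ratio between the brightest and darkest usable parts of the image. The short sketch below is the editor's own illustration – the figures are hypothetical examples, not ARRI specifications.

import math

def dynamic_range_stops(brightest: float, darkest: float) -> float:
    # Dynamic range in stops = log2 of the usable contrast ratio.
    return math.log2(brightest / darkest)

# A scene whose brightest usable highlight carries 16384x the luminance
# of its darkest usable shadow spans 14 stops.
print(dynamic_range_stops(16384, 1))   # 14.0

# Each additional stop doubles the ratio the sensor (or HDR monitor) must hold.
print(dynamic_range_stops(32768, 1))   # 15.0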

Ed: Okay, so how many stops is ARRI's definition of HDR? Forest: Our sensor, even starting with the original ALEXA, so even old ALEXA footage that was shot 6 years ago, could be remastered for today's HDR monitors. None of the other cameras back in those days had even close to this ability. So that's something where in many ways we were futureproof back when everybody else was saying "oh, we need higher resolution." Actually, HDR was important and, at the time, we decided that we needed the highest overall image quality, and HDR is a big part of that, and I think the industry has now come to a point where everybody has recognised that. Ed: And another really important feature of the ALEXA SXT? Forest: We have 4K recording, both for cine and for UHD in the camera, so you can have a ProRes UHD or a 4K cine file generated right in the camera, recorded on your media. So if you have a 4K workflow, you can start editing right away … I mean, it's upresed in the camera, so no postproduction is needed to make it 4K ready. And on top, also for anamorphic, we've added a 6:5 recording mode, so that you can get a finished 2K or 4K anamorphic already desqueezed in the camera. So this should really help a lot of people shoot faster. Basically, we've thought about what are all the different formats that are required in distribution, and we went back from there to create the right file formats and resolutions in the camera, so that delivering those resolutions is faster than ever. Ed: Fantastic. Now, some things that you've added across the range, but starting with – I guess this is not specifically for, but particularly useful for the AMIRA? Forest: For the AMIRA and the Mini, we've released the CCP-1. It's probably going to ship later in the year, around November. It is the interface and the screen that used to be on the viewfinder, the MVF-1, of the Mini and the AMIRA, separately. It can be daisy chained on both the Mini and the AMIRA so that you can get the interface on either side of the camera. So if you have both the viewfinder and the CCP-1, you can have the assistant on the right side have the interface, have the menu, while the operator himself can also use it on the left side of the camera. It's similar in a way to how the ALEXA has always been used. But theoretically, I think some people may choose to not even get a viewfinder if they're used to shooting with a small screen, and have the menu and the options on top in that way. So that's a new product, but it is daisy chained. Ed: And now a fancy grip that has wider applications and this is a product called Master Grips? Forest: The Master Grips come in a … what would you call it? Ed: Form factor? Forest: Form factor, exactly … Ed: I've got all the words, it's just getting them in the right order!


Forest: Well Grant, you’ve been in the industry for too long now. So yes, the form factor is based on the traditional hand grips we always had with our film cameras, and they’re much loved by a lot of operators. So we decided to use that form factor and then add digital features to it. We’ve added buttons, a rocker and a wheel that can be configured to control motors – the servo motors that you see on ENG lenses, but also the motors that we have for our ECS range of products, which is kind of cool because now the operator can have a shoulder mounted camera and can do zoom, focus, iris, start-stop, even some of the camera settings right from the hand grip without having to remove their hands from the hand grips. Previously, you really only had "start-stop" at the most and now you have so much more in your hands. Ed: So this is going away from the ALEXA concept of having a large crew; this is now providing a greater range of control to a lone cameraperson? Forest: Exactly and I think what we’ve learned is that it’s more comfortable to have the hand grips at this lower angle and many of the ENG cameras always have this kind of L-shaped arm grabbing the servo on the lens and it’s not very comfortable. So traditionally, film people have operated in this way and I think even in image quality and everything, there’s a kind of a coming together of the film and television people. Movie people need smaller crews and want to shoot lighter, but also TV crews want to shoot more and more

in a cinematic way and the Master Grips really combine the best of both worlds. Ed: And any comparison with the form factor of the Sony FS7 is grossly unfair? Forest: Well yes, I think so. Ed: Right, we’re going to go on and talk about a quite revolutionary addition to the range with … what’s it called? Forest: We call it Multicam. We support the Sony simple protocol through the Ethernet port on the AMIRA

The Multicam controller.

and it allows us to use traditional Sony RCP remote control panels to control iris, colour, all of this, on the cameras, like in a traditional Multicam kind of studio or OB van setup. It can be expanded to offer fibre with products made by DTS, Multidyne, Celcom, so it’s compatible with a whole bunch of these fibre options. But what it allows us to do, and we’ve seen this now with shows like I am a Singer in China, and this is big, you have to imagine it’s like The Voice or other big, big shows … they’re using this way of shooting. So they can shoot and have all the control in the control room for the cameras, for the iris, for all those functions. With the higher sensitivity of the AMIRA, better colour rendition, and then with the use of cinematic lenses, you can get a cinematic look on this kind of live TV production which was never possible before. We believe this is going to revolutionise how at first the bigger TV shows are going to look. Ed: Now we bring in Marcus Duerr to explain more about Multicam. This really is moving ARRI cameras into the television area isn’t it? Marcus: It is, and that’s the purpose that, for the one part, we allow productions in the television environment, like live productions or multi-camera productions in general, to benefit from our superior image quality so that they can bring that image quality into these kinds of productions as well. At the same time, it’s another opportunity for our customers to leverage the cameras in a larger variety of production types. So for a rental or a production company, they can use the camera not only for single camera productions, but also now for multi-camera productions. Ed: I would say for stage shows, using this is just a no-brainer?

Marcus: Stage shows, music productions, these kinds of things, certainly TV shows, TV soaps … Ed: So what's different … I mean in the past, I'm sure you could have got together 4 AMIRAs and used third party hardware to join them all up and still do a stage production using them, but what have you added that makes this even better? Marcus: Yes you can certainly do that, but you're then missing one important function, one that is crucial for a multi-camera production, and that is that you can remote control the cameras from the control room; you can adjust image parameters and, most importantly, you can adjust the iris. Typically the image engineers that do that are used to a certain interface and that is the Sony remote control panel; and typically that is the only remote control panel. So what we added to the AMIRA functionality is an interface that allows remote control. What you then need to do is connect them both with some kind of transmission system and there are several options to do that. You could do that in a wireless way if you want, but typically you would use a fibre cable, which would also then provide the power supply for the camera, and in that case, what we're showing here is the solution with PTS who provide a fibre back end and a fibre CCU and all the environment needed to transport the signals from the camera and the control data to the camera with a single fibre cable. Ed: And that's really it, it's that level of control that was not available from third parties previously? Marcus: Exactly, that is what you don't get. I mean, you can remote control the camera with a BAP interface, that is possible since they run with AMIRA, but that is not what image engineers are normally used to when they work in a multi-camera production. They need tools they are used to and they need to be quick and they need to be reliable, so they need a known interface and that is what they have with the Sony remote control panel. Ed: And there's nothing that changes with the AMIRA – the camera itself is just your bog standard AMIRA, you just add this on if you require the functionality? Marcus: It is part of the standard AMIRA software, so it's not special hardware; it comes with every AMIRA starting with SUP 4.0. So there's no problem to do that with any AMIRA, if you update it to at least SUP 4.0.

Marcus (left) explains Multicam.

Ed: Now Forest, we have come to a totally new product from the ARRI range, which is? Forest: These are our camera stabilisation systems … Ed: What, there's more than one?

Forest: Yes. They're made up of 3 product ranges – the Artemis, which is the traditional type of camera stabilisation equipment, and the Maxima range, which is a 3-axis gimbal, a very cool product because it takes up to 30 kilograms of payload, so that's very different, and there's no limitation on the length of the camera, so it's very easy to find balance. Ed: So have you come to an arrangement with the makers of the Artemis and the makers of the Maxima and combined those 2, along with a bit of ARRI technology, to come up with this totally new and improved system?

Forest: Then you get the Trinity …

Ed: That’s very Catholic of you. Forest: The Trinity is taking traditional stabilisation and adding what a gimbal is. So it’s putting the 2 together and many people are calling it kind of a Steadicam 2.0 or a second life of this kind of product. But if you see Kurt demo it, you’ll understand just how flexible it is and how many new different types of shots you can get that, in the past, either required a lot of prep or time in between, or you had to do 2 takes. Now you can do it all in one fluid motion, so it’s very cool stuff. Ed: Now I know with any of these systems, even the simpler ones, unless you’re trained in using it, it can be a dog’s breakfast, so is ARRI coming to the party here and offering training to purchasers of this gear? Forest: Absolutely. So probably for over a year now we have the ARRI Academy actively working on courses – and not just for stabilisation system use, but we’re planning to do kind of a 4 day, 5 day course on this, depending on what the level is. People who already have experience with stabilisation systems obviously don’t need as much training, but we have it for beginners all the way to advanced users. Ed: Now the man himself, Kurt … Kurt: I think this is a really dynamic, dramatic point of use. It’s the very first time that you can use a mechanical stabiliser to do "over the shoulder", because I can get the rig very high, but I can tilt down the camera. The other thing I can do, because I’ve got 2 axes more on top of the stabiliser, is that I can look around the corner. That means I don’t have to go into the room to look into the room; I can stay outside and look just behind the door, get the actor out – things like this. I can shoot even inside a car – just sticking the camera in through a window, I can cover a dialogue in the car and then I pull back and get, for example, one of the driver or the passengers out of the car. I was not able before to do this with a regular good old mechanical stabiliser, so this combination, because you get 2 more axes, gives you much more creativity. And of course, because there’s a brushless gimbal, even if I get a windy situation, I’ve got always a rock steady horizon, headroom is clear, and the other nice thing is I can even give part of the control over to the DP because he’s using the wireless remote. So for example, if we want to have a little Dutch angle, the DP can find the right moment to move the camera.



Kurt: It’s a bit of a change. I think it’s like you would put an aircraft pilot into a helicopter – it’s still flying, but you have to think a bit differently. Ed:

There are a lot more electronics involved in this?

Kurt: But you don’t feel this, not at all. We made it that way that it’s absolutely intuitive, you don’t have to worry about the computer … this is really a nobrainer to use it. The other thing is what is new for the operators … you have to understand that sometimes you have to use a joystick to control headroom and you cannot tilt the post anymore. So we’re moving control from the left hand ( in my case ) to the right hand, to a joystick. So this is a new way of thinking, but it’s interesting if you have young operators – for them, it’s easy to adapt to a joystick control. For the older guys, it takes a week. Now the other thing that is different is again you have a new tooling to tell the stories, so to design the shot is a new approach, because you’ve got new possibilities. So it takes a while to find out what is really possible. Ed: So this is where the director of the photography on the shoot needs to pass that on to the programme makers to say “look we now have more possibilities that we can offer you”? Kurt: Yes, that’s the point – just to realise what is now really possible. Ed: I guess the other thing is that, with cameras coming down in weight over the years, the stabilisation operators are thinking “wow this is going to get easier for us, because everything’s getting lighter” but now you’ve added more hardware to the stabilisation side – has it become more onerous, is your time that you can spend in the saddle reduced?

Kurt shows one of many angles possible with Trinity.

Ed: So in other words, you can have a discussion before the shoot with the other people on the shoot and you can say “right, I’m going to do the move, you look after the camera”? Kurt: I think even for the DP this is a new way of working with a stabilised operator because we can share the job. Ed: So it’s not always your fault if something goes wrong? Kurt: This is good isn’t it! There’s another interesting aspect about it – you can do really long “one takers” because I can give you so many angles in a row which are so interesting that I think now DPs will work much more with a third camera operator together to create and conduct a shot. Now the other interesting part is that I can take the head off and I can use it as a handheld gimbal. You can take the thing up and put it in the rigging and you just use the wireless and you use it as a remote head. Even as a second unit, I can give the production way more tools. Ed: But it’s knowing all this. I know one of the biggest issues with any camera stabilisation arrangement is the learning, is understanding how to use it properly in the first place. You’re obviously right at the top of the field with the Artemis, so for somebody who is in that space who is a good operator of stabilisation equipment, is it much extra training to learn all these new parts?

Kurt: No, if you look, it's not really heavier than any other rig because we removed a lot of the classic parts. You see there is no monitor anymore, so there's already 2½ kg gone. So I'm adding 4½ but removing 2½, so in the end we're just shifting weight, that's all. But it's not too much extra weight because my new monitor is now sitting here at the gimbal; there is no monitor down there anymore. So the weight thing is not a big issue, and of course a camera like the Mini is perfect because this is all about agility – we want to have a live camera setup, we want to be fast because we can go quickly around a corner, we can go very quickly from high to low mode. Ed: And, as you say, it's a very flexible package that you can use in many forms. You can just use it with the brace and the Artemis if you wanted to, but you've got all these other parts to it that can be used either by the operator or, as you say, you can put it on a crane and it just gives a whole new tool box? Kurt: Yes, I think this is what we have to understand today. You can't only show up on a set as a stabiliser operator; we should be able to offer way more to the production. So now I can be the second unit as a handheld stabiliser; I can take this thing off, go into a car and immediately I can give you a shot out of a car. In the old days, to get a stabilised system into a car was like a 60 minute thing, just to get the hard mount in, the spring arm. Now I just take off the head, take a sandbag, here we go and everyone is happy, because we don't waste any time, but still the production can work with me, an experienced cameraman, because that was the big threat some years ago – we were being replaced by cheaper gimbals and very young operators. They might not be skilled enough to run an expensive production and some of the producers were crying because they would love to have back the good old guys, and I think this is maybe the right tooling for both sides – hopefully. NZVN



Kino Flo for PLS

For PLS, we are at Kino Flo with Frieder Hochheim, founder of the brand. Ed: We all know Kino Flo for the Diva-Lite and I was going to say "well, we've done those", but you've just dropped a bombshell? Frieder: Yes. What we've done is we've banked on some of the iconic products that we've had and we've moved them into the 21st century. So on the Diva-Lite, a very popular product, very well known, we've now taken that into the LED realm. Last year we introduced it – a cosmetic light from 2700 Kelvin through 6500. This year, it's very much the same instrument, but with new software we've extended that range, and the extended range offers you full colour and Kelvin range. So the Kelvin range now goes from previously 2700 down to 2500, with a high of 9900 Kelvin. In addition to that, you can go into a gel mode; you can select either from a list of custom gels or custom settings that we've determined, such as candlelight, such as sodium vapour, such as mercury vapour, cool light, warm light – some of those challenges that you meet when you walk onto location sets and you go "well, how am I going to match that kind of a colour?" Then in addition, we've added gels, so some of the more common gels that you'll find from the major gel manufacturers – we have their common descriptors in a list that you can choose from. Then we add another feature where we can offer hue angle and saturation. At full saturation, if you go into your hue angle mode, you can dial in 360 degrees of colour. So any imaginable colour, saturation, tone you want to get, you can now nail it with the Diva-Lite. You can nail it with the other product which is the Select, which is sort of a hybrid 4 foot 4 bank, except a little smaller – same brightness, same weight, but a lot of the flexibility of what we had with our original 4 bank. On the Select for instance, you have a removable ballast; you can go 25 feet remote with it; you can also go wireless, and that wireless function is something that's shared between the Diva-Lite and the Select as well as our Celeb line, which is one of the earlier LED products that we've brought to the market. Ed: Now you're going to tell me that you're not going to upset all those lighting people out there who have been using your fluoro product for many, many years – you're not going to stop making the fluoro lines are you? Frieder: Not at all. We have enough phosphor powder to last us into the next century, and the bigger challenge will be beyond my control, and that is how many factories will continue making lamps. That will be the bigger question. From our perspective, we continue selling, we continue producing; it's a very strong product with the worldwide distribution that we have; it's still a mature product with a high quality unsurpassed in many regards.
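Frieder's "hue angle and saturation" mode maps naturally onto the familiar HSV colour model: a hue angle from 0 to 360 degrees plus a saturation level picks the colour, which the fixture then mixes from its emitters. The sketch below is only the generic HSV-to-RGB conversion from Python's standard library, offered as an illustration of the control model – Kino Flo's own colour engine is proprietary and not shown here.

import colorsys

def hue_sat_to_rgb(hue_deg: float, saturation_pct: float, level_pct: float = 100.0):
    # Map a hue angle (degrees) and saturation (%) to an 8-bit RGB triple.
    # Generic HSV maths only - an illustration, not the fixture's internals.
    h = (hue_deg % 360.0) / 360.0
    s = min(max(saturation_pct, 0.0), 100.0) / 100.0
    v = min(max(level_pct, 0.0), 100.0) / 100.0
    r, g, b = colorsys.hsv_to_rgb(h, s, v)
    return round(r * 255), round(g * 255), round(b * 255)

print(hue_sat_to_rgb(0, 100))     # (255, 0, 0) - fully saturated red
print(hue_sat_to_rgb(120, 100))   # (0, 255, 0) - fully saturated green
print(hue_sat_to_rgb(200, 40))    # a desaturated cyan-blue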

Ed: Well I do have to say myself that Kino Flo has really become an industry standard word. I'm sure it's in the Oxford Dictionary. When you're talking about lighting quality, fluorescent lighting, well you have Kino Flos – that's it, they're the benchmark against which everything else is measured. That's got to be good for the company? Frieder:

That’s very kind of you and I agree …

Ed: Alright, but you've seen the competition and certainly by your description of where you've gone, one major European manufacturer has got a light that seems to provide a huge range of variables all in the one, all software controlled, so you've really got to match that, but do you think you've surpassed their levels? Frieder: I think it's the difference between Porsche, BMW and Mercedes. Ed:

I’m glad you didn’t say Chevrolet.

Frieder: No … and I think that the point is we're coming from a very practical on-set experience level. So for me, what was important is yes, the features are important and we can match all those features, but it's how you interact with it. It's that intuitive element when you're standing in front of the instrument, the director of photography is screaming at you "where the hell is my such and such, where's that setting", and you've got a bunch of little buttons to push and you go "well what sequence is it now, where am I supposed to be, I can only see one value or two values at a time on the screen …" We've set up our display and our control so that in one glance you see all the settings, you know what your dim level is, you see your Kelvin, you can see your DMX address. If you're in colour mode, you can see whether you're in gel mode, whether you're in hue angle saturation mode. Ed: Have you got this on an App so you can do all these changes on your iPad? Frieder: There are secondary App manufacturers that we can interface with, so we're operating on a wireless system developed by LumenRadio and then there are third party applications that allow us to interface with that. More and more of these Apps are being developed that are more and more reliable. At this point reliability …



Kino Flo Select LED Firmware Patch adds Colour mode - 2500K to 9900K.

• Colour-correct with high colour rendering index
• Universal input 100VAC-240VAC or 24VDC
• Flicker-free, dead quiet operation
• Full control with Lumen Radio® Wireless DMX
• Built-in Barndoors & 90° louver included
• Even while dimming, fixture without colour shift
• Vast palette of professional cine gel colours
• Colour wheel with hue angle and saturation
• Balance to any professional light source on set
• Map the lights to spectral sensitivity curves of all makes and models of digital cameras


Ed:

Aaah reliability, yes, yes …

Frieder: Because when you're dealing with a congested spectrum such as WiFi, you can expect to have some challenges. Now if you're on a set in a studio, you're less likely to face those challenges than here on the floor of the show, where everybody's trying to go wireless and you have all these competing signals fighting with each other … it's a little bit of chaos out there right now. Ed:

A point of difference?

Frieder: A point of difference – some of the manufacturers, if they're trying to get to the higher brightness levels, they will incorporate fans which, if you're on a quiet set or you're in close proximity to talent and you're into hour 3 or whatever of working and midway through a take a fan kicks in, that's not going to fly. As a design principle, that's not something that we would ever endeavour to do. So in the LED market, you basically have 2 ways of looking at light … if you want brightness, you're going to have weight; if you don't want the weight, you're going to have to move that heat off somewhere, so people incorporate fans, which on one instrument may not seem so bad, but if you're in a studio and you have 40 instruments hanging and they all seem to cut in at the same time, well now you're altering the ambient dynamics in that room and it will have an influence on the sound. We've seen it now already demonstrated in some studios in Europe where we were told about these challenges that they had. So the other way is, if you want it lightweight, well then you just expect it to be very bright, and given where the ISO's of these new cameras are running at, most of the directors of photography in the cinema market who I'm talking to, they're starting at 500 ISO; if they're on a street, they're shooting at 2000 ISO. Most of our instruments here, when we quiz our clientele they're saying "well really, for the most part, 80% of the time we're at 20% of light output." So again, it really makes you think then from a design perspective where is the most bang for the buck, where's this market going? These lower levels are really becoming practical in achieving really high quality imaging and so for us it's about understanding the cinema camera, what is its spectral response. We've tailored our colour points to really relate to the camera, not photopically to what your eye sees, although that's an important element because you still have to be able to evaluate colour. But we really do understand the cameras and so that's where a lot of our time and research has gone – understand the cameras, just like we did with film stock, so now we just say every new cinema camera is like a new film stock. They do have their distinct spectral qualities, their spectral sensitivity curves that are different from each other.

It looks the same but it's now LED powered.

Ed: So do you see the future where you could actually dial up a Sony FS7 into your lamp? Frieder: It's conceivable and we could in theory do it now because we have the data. The bigger challenge is, once you as a manufacturer offer that feature, you now have to support it by staying relevant to the evolution of the cameras, which is a high cost endeavour and it's something that has to be somehow monetised. Ed: You could call it Brand X? Frieder: No, no, but the process work that goes into getting this information because none of the manufacturers are willing to share that data, so we have to extrapolate it out of the camera through testing, and that's a timely process. Ed: And it's also intellectual property? Frieder: Well it's our intellectual property once we've figured it out. They figured out how to get there, and we figured out what it actually is that they did. NZVN
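Frieder's "every new cinema camera is like a new film stock" point comes down to weighting a light source's spectral power distribution by each camera channel's spectral sensitivity curve. The toy calculation below is the editor's sketch of that idea; the curves are entirely made up, standing in for the measured data Frieder describes.

import numpy as np

# Hypothetical data: wavelengths (nm), a light source's spectral power
# distribution (SPD) and one camera channel's sensitivity curve.
wavelengths = np.arange(400, 701, 10)
spd = np.exp(-((wavelengths - 550) / 80.0) ** 2)                 # invented source
green_sensitivity = np.exp(-((wavelengths - 540) / 40.0) ** 2)   # invented channel

def channel_response(spd, sensitivity, wavelengths):
    # Channel response = integral over wavelength of SPD x sensitivity.
    return np.trapz(spd * sensitivity, wavelengths)

print(channel_response(spd, green_sensitivity, wavelengths))
# Two sources that read identically to the eye can drive two cameras'
# channels quite differently, which is why the colour points get tuned
# per camera rather than purely photopically.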

Matthews for PLS

For PLS, we're at Matthews with the lovely and hugely talented Linda Swope. Ed: Now Linda, just a little bit of a follow-up from NAB when you launched those 3 fantastic products, the Duty Dolly, the Infinity Arm and the Kerrigan cart, that particularly impressed me. Linda: Yes, they've all been doing really well, the Infinity Arm in particular, I can't keep them on the shelf. It's crazy, they sell out of stock. Ed: Well it's that little bit of equipment that really you've got to see it and connect something to it to see "wow, I can use this in that situation when what I used didn't work" as well as obviously what this is going to do. And it's something that's an ongoing product –

you're clearly continuing to develop little bits and pieces for it?

Linda: Exactly, the interchangeable tips, you know we’ve got a handful of them now – Baby Pin, a Mini Matthellini, a Go Pro mount, a receiver … there are many more tips coming out. Ed: Okay, but the big one this time is … again it shows just how responsive you are to the market, because until NAB, the whole 360 degree camera thing was, well, who knows, and suddenly, BANG, it’s there. Now within 6 months, you’ve come out with what? Linda: A VR stand called the Matthews VR Rig. It’s basically a stand that we created with a very narrow profile because if you’re shooting 360 you don’t want to get the stand in the shot. It comes with a Magic Finger mount so you have the ability to 360 your 360. You know it’s heavy duty, but it’s not heavy duty – it’s



heavy enough for what you need to do. You can get it with a couple of different accessories – one is an Auger Spike, because at the very bottom there's a three-eighths tapped hole and the Auger Spike screws in. So if you're on the grass, or in the dirt, you screw that in and you get the stability without having to sandbag it and make it bulky on the bottom. There's also a suction cup that you can screw into the bottom if you are, for example, on a basketball court – suction cup down, you get the stability. There's also an option to add a barbell weight, so you have that weight at the bottom. So again you avoid sandbags and making that bulky bottom that you don't want your 360 camera to see.

The Magic Finger mount close-up.

Ed: And it's for any 360 camera, though you've got an OZO here? Linda: We're showing an OZO and we're showing a GoPro mount. It also comes with a black skirt that covers the legs, because if you're running your camera and you're mounting lighting, you're going to hide your batteries and your cables and everything under there to keep those out of the shot. Ed: I suppose with the stitching software that they use with 360, you don't need to use like a green screen – you're just using black? Linda: Well we actually do make a green screen skirt as well. Ed:

Are you going to model that for us?

Linda: No I’m not! And then also, which is really cool, is around the stand there are three-eighths and quarter-twenty tapped holes, so you can mount lights on it. You know you’re doing 360, you’re shooting all the way round, you don’t want to see light stands in your shot, so you light from the stand out.

Linda beside an only slightly slimmer VR Rig.

Ed: Again that’s another Matthews thing – you just drill in all sorts of places and you never know, somebody’s going to stick something there? Linda: Yes. And no one else is making this stand. When we first came out with it and we were getting all these orders, I asked “what did they do before?” They were just kluging bits and pieces and stuff together, and we came up with this and it’s been received really, really, really well. It’s a cool stand. Ed:

Well that’s what we expect from Matthews … cool.

Linda: You get it.

NZVN


Dedolight at IBC

Now we have Raphael Pollock at the Dedolight stand to tell us "what's new." Ed: Now Raphael, just a quick run through – new since NAB? Raphael: We're fully up and running with the Active Cooling range of our Dedolights. We've got the DLED7, which sits right in between the popular DLED4, which is 40 Watts of power, and the DLED9. It's the same housing as the 4 but we put a small fan inside to achieve a higher output. Complementary to this is the DLED3, which is the same unit, one size smaller, and we're just showing here the DLED10 which is about to hit the market by the end of this year. Ed:

Is that the one with the ballasts around it?

Raphael: The double ballasts, exactly – that's taking a 160 Watt LED inside the housing of the 90 Watt unit, and we use the fan to keep it within its parameters. Ed: And it's a test I know I've performed – you put your ear against it and you can't hear it?

Raphael: It's a very, very low frequency running fan and we don't expect any problems to come over time. In the worst case, obviously the fan can be exchanged later on. On top of the booth, you can see 2 PanAuras, 5 and 7, which have an LED driver inside. One is a bicolour 2x250 Watt driver and the PanAura 7 is a 500 Watt LED chip – the one all the way at the back with the big spread. So this is intended for broadcasting institutes which are transmitting 24/7. There the LED energy saving is quite important. At NAB it was a preproduction model; now we have the Tecpro Felloni Turbo running. Same concept – we use a fan to drive a higher light output and you could switch off the fan if you need to run totally silent. Ed:

Now is this a new brand for you, Tecpro?

Raphael: Tecpro is the brand of Dedotec, which is the engineering part of Dedo Weigert's firm where all the lower priced products are brought in. So this means it's our design and our concept but we don't do the production. Because Dedolight still is a premium product, we need to complement the market with the Tecpro brand. Ed: Now also on your stand, you have the Hive range which we saw at NAB some years ago. You've really taken it on and you're really pushing this cooperation with Hive Technology? Raphael: First, the Hive product presents a unique technology. You don't find any plasma lights in our industry so far, and since we are one of the prime high speed houses in Europe, to get a punchy light which is totally flicker-free is quite crucial, and that's what the Hive light does. So we see it more in the high speed productions than in the traditional stills or broadcasting productions. And a unique tool which we present for the first time is the ParaBeam 1.2K light which you see up there, which is producing a 4 degree beam, almost parallel.

Ed:

4 degree spread?

Raphael: Yes. The idea is that you then use reflectors to reflect the light. You can reflect the light into the set, and there's an Austrian cameraman called Christian Berger, who shot The White Ribbon, and he's used it for all his big wide shots – so in a church scene, if your lighting is static, some people prefer to have reflected light rather than light from a traditional source. The benefit on a production, if you commit to it, is that instead of bringing in another 4K and one more 4K and a 2½K, you bring more and more reflectors, because with a ParaBeam you hardly have any fall-off of light – you just put in the reflector, take it out. So that's pretty unique.
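The reason a near-parallel 4 degree beam bounced off a reflector behaves so differently from a conventional source is the inverse-square law: light from a small diverging source falls off with the square of distance, while a (near) collimated beam holds most of its intensity over the throw. A rough comparison, assuming an idealised point source against an idealised parallel beam – the editor's simplification, not a Dedolight measurement:

def point_source_falloff(distance_m: float, ref_distance_m: float = 1.0) -> float:
    # Relative illuminance of a diverging (point-like) source: inverse-square law.
    return (ref_distance_m / distance_m) ** 2

# An idealised parallel beam keeps roughly constant illuminance with distance,
# which is why reflected light from a ParaBeam-style source reads much the same
# near the reflector and deep into the set.
for d in (1, 2, 4, 8):
    print(f"{d} m: point source {point_source_falloff(d):.4f}x, parallel beam ~1.0x")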



Matthews Infinity Arm Superior Strength. Interchangeable Tips. Infinite uses…

• Strongest articulating arm on the market
• Weight capacity of 7-14 kg
• Rig cameras, lights, monitors & more
• CNC tool'd 6061-T6 Aircraft Aluminium
• 100% manufactured in California, USA
• 60 section 'No-Slip' centre rosette
• Quick release tips: 1/4" Male & Female, 3/8" Male & Female, 5/8" Male & Female, Matthellini, Cold Shoe, GoPro


Ed: So again really cementing the place of Dedolight in that special lighting area? Raphael: I think Dedolight is known for being really strong, or even the market leader, in portable studio lighting for all broadcasting crews. It’s definitely the special lighting for all the feature films and we always have been the speciality house, so you’re completely right that, for all the high speed work, that’s us. But the major market share definitely is in broadcasting location work, then comes film production, then comes photographers, then museums and then comes all the specialities, because you don’t need a lot of the special lights, but you do need those. NZVN

Avid for Atomise

We are at Avid for Richard Kelly with Ren Middleton from Australia and Yew-Jin Cheong from Avid Singapore.

Yew-Jin and Ren.

Ed: So you're here to listen to Ren and to add a little bit of technical support when Ren gets lost? Ren: Absolutely. Yew-Jin heads up our presales and solution design team for APAC so he's got a lot of experience in the industry and he's a very clever man. Ed: Well there you go, there's got to be one around somewhere. So NEXIS, big, big things? Ren: We do have a new NEXIS product which is E5, providing incredibly dense storage. Per unit, you can build it up to about 480 terabytes, but it can grow to nearly 1.5 petabytes. You can use that in conjunction with your existing storage, so if we've got ISIS 7500 users or even 5500 users, there's a really good migration path to bring people onto the NEXIS line of products. Ed: And that's good because?

Ren: Well for the E5 in particular, it's a smaller footprint, so you need less rack space for the same amount of terabytes as other storage vendors may supply; you use less power; you need simpler switching infrastructure because you don't have as many chassis to have to account for. So there are lots of those advantages and, as you scale up, the cost per terabyte goes down the bigger you grow it. Ed: Now I guess here's a question for Yew-Jin – I understand that there are third parties who put together storage solution packages and then, with a little bit of clever software, make them appear to an editing system to be an Avid or an ISIS system. Are there any pitfalls for customers going this way, rather than buying the genuine product?

Yew-Jin: I see, you give Ren the easy questions right … well there are certainly pitfalls in the sense that many of these solutions are sort of "cobbled together" when they try and put together what they believe to be suitable combinations of hardware and software. And like you said, it is an approximation or a facsimile of how the Avid file system works, but there are also many, many aspects of the file system when it comes to the ability to share content for simultaneous access across multiple clients and having optimal performance for each and every one of those clients. I guess the key thing to note is that, with Avid shared storage solutions, even though for Avid NEXIS we are using common "off the shelf" hardware, this is hardware that has been thoroughly, rigorously tested in the Avid labs together with our file system, in excess of 300 editing clients, and we are able to very accurately characterise real time performance based on the specific editing conditions, depending on the resolutions that you use and the layers that you're using to edit. So I


guess, whenever we have any shared storage solution that's released to the market, you know it's been rigorously tested and we can put our name behind the performance that we guarantee for the customer. Ed: But surely it's the software, since they can't write exactly the same software that you use, because – well that's illegal, so there must be a difference in there. Can they exactly duplicate what an Avid system can do?

Yew-Jin: Hmmm … many of them claim that they can.

Ed: What are the sorts of areas where they would fall down or a customer would notice that "hang on, this can’t be genuine?" Yew-Jin: I think that there are a couple of areas and obviously the most important area is in collaborative real time editing where you need to have multiple editing systems playback reliably the same piece of media at the same time, and that’s very often when many of these systems fall over … where you need to have this type of environment and to be able to playback your content without missing a frame. So we have many customers around the world who actually, because of the time sensitive nature of their programming, go to air directly from a Media Composer editing system. They hit Play and it actually goes out for transmission. So in such mission critical environments, you want to have the reassurance that, 10 times out of 10, it is going to happen without a glitch. I’m certainly not going to say that other vendors are not able to do it, but … Ed: Perhaps not able to do everything at the same time? Yew-Jin:

Exactly.

Ren: I think as solutions scale, that’s where some of these fall over. It’s okay for a few clients, but not many of them can scale anywhere near to our 330 clients that we’re qualified for. They generally start falling over after they have a number of clients on there. It’s all about the NEXIS file system which is really the strength. Ed: I also assume that it’s all about that upgrade path. They might be able to come up with a solution now, but when you come up with the next level can they compete, can they match that? Yew-Jin: And I think that there are also a few things to consider in that respect – whether you’re able to expand your storage without destroying your media, shutting down your system and disrupting your production. And the other thing is, as you scale your storage, are you having, for a given amount of storage that you add, a proportionate increase in performance

and bandwidth. So with Avid, that relationship is very much linear. You add a certain amount of storage, you get that same certain amount of bandwidth increase; whereas with some other companies, that is actually not the case. Another thing with scalability is not just in terms of scaling up, but also sometimes scaling down and I think a very good example of that is in the way we limit access to individual users or groups of users in terms of the amount of space that they can use on shared storage as a means of managing the storage capacity. Many systems are able to allocate for users additional storage, but when it comes time to actually reduce that amount of storage, many of them have difficulties doing so. Ed: And another area is the Avid partnership – if you have a company that you trust to work with you, such as MOG, and you have many others in that Avid partnership agreement, you know what they’re doing and it conforms to how you do things so people can trust that, if they’re on that list, it’s going to work with you? Ren: Absolutely. They’re certified Avid partners, so they go through a testing regime, they get signed off and ticked off. We provide the first line of support to make sure that the solution works and it’s all about part of our openness and Avid Everywhere strategy and our MediaCentral platform. We expose all of our APIs, we offer all of our APIs to these third parties, the same APIs that we write to ourselves so they can have very tight integration, even if they’re an opposition company or our competition, they can still write into our platform. There’s a very strict certification process that goes on. Ed: You’ve got to be careful when you expose yourself don’t you Ren? Ren: Yes well particularly in Amsterdam Grant. Ed:

Oh you’ve found that have you?

Ren: No – a friend of mine mentioned it the other day. Ed: Now talking about Avid certified partners, this is not necessarily just developers developing other software, but this sometimes can be media partners? Ren: Absolutely, in fact we've got a couple of examples in New Zealand where we've offered this service to some of the well-known customers, 2 of the major networks there. For one of them, we did the development ourselves through our Avid labs process and so that's a paid for service; the other broadcaster chose to do some development themselves, so we offered them all of the APIs so they could access all of the information and integrate and interact with our MediaCentral platform and they've built their own workflows to move media between their radio system and their Newsroom MediaCentral system. So it's worked very successfully and they're a clever bunch of people those New Zealanders and they did the development in-house very quickly. Ed: We also got a lot of Olympic medals too, didn't we?

Ren: Not as many as Australia! Ed:

And now we segue to story-centric workflows?

Ren: So story-centric workflows – this is very relevant to the New Zealand market because a lot of the guys down there have our systems as you well know. But the focus has really moved from a rundown centric story within a bulletin, to more about "what is that story about?" So from the beginning of that story, it doesn’t necessarily have to be in a rundown, it can just exist on



its own. So what you can do is you can gather material around that story; you can log in online, find out aspects of social media so you can gather that information; you can start building the story, you can start adding notes and all manner of information, but then you can also determine "okay, well where do I want this story to end up – is it going to be the 6pm bulletin? Yes it might be … alright I’ll click that one … it’s going to be social media …" So we start gathering aspects of that story together in a single location and now all of that information travels with the story throughout its entire lifecycle, depending on whether you’re going to deliver that on air or online or to social media or whatever. We’re just highlighting the first stage of that here at the show in relation to the assignment desk functionality. Now that will move forward into schedule functionality in the not too distant future. Ed: I guess that would very much depend on the metadata that went with that material when it was ingested? Ren: Potentially, and we have some camera partner alliances that can actually get the information from the iNews system out to the camera out in the field, so they have a unique ID from the get-go when they’re shooting that material. So when they bring that material back, they know that that’s actually associated with a particular News story. There are lots of workflows that are happening in relation to that already. Ed: Now the integration with bringing Orad into the Avid graphics system with Newsrooms … that continues to grow? Yew-Jin: Absolutely. Everything that we do with the Avid solution set is based upon the concept of the MediaCentral platform. With the acquisition of the Orad solution, we have worked very hard to incorporate a lot of these solutions into the MediaCentral platform as well. If you look at something like Maestro for example, which is our graphics playout and management solution, that already is integrated with MediaCentral. So from a single browser-based UI, you can grab your templates, you can text information and that will then be linked with any particular story that you’re working on. So that particular story ends up in your rundown, the graphics will also automatically be played out. Another example is with PlayMaker, which is a sports sort of OB van studio fast turnaround type solution. That’s also integrated with the MediaCentral platform in that the content acquired can be very quickly turned around, or it can be ingested into MediaCentral and the NEXIS shared storage system for collaborative editing. We’re also launching at IBC this year a new tool called Sports Logger. As you’re ingesting your live content, you can then be adding metadata information that … if we take the example of a soccer match, that pertains to individual players at the click of a preset button,

pertains to goals, penalties and so on and so forth, so that when the game is over, you’re able to very quickly call up individual players and do your halftime recap or summary, or you can search based on the metadata that you entered. All that is stored within the MediaCentral platform as well. So we have made great strides and we’ll continue to develop in that direction. Ed: It just continues the story of Avid really supporting the people who are using their products? Yew-Jin: Yes absolutely. We’re supporting them, we’re helping them maintain their workflows, but we’re also offering them far more integrated workflows as well, with very clear paths of migration for them. Ed:

And that’s key to your success Ren?

Ren: Of course it is, yes. It is in all seriousness what our customers want. They've wanted a platform so they can save money on gluing all the disparate systems together … we can bring it all together in a platform, be it by expanding the capability of our own products or by integrating third party products. Ed:

So Avid is everywhere and Avid is listening?

Ren: Absolutely, we do listen. Ed: And Yew-Jin, you're telling us that Avid is not only listening to its customers, but is actively engaging customers in a dialogue that is going to help them – and help you provide to them what they want? Yew-Jin: We absolutely are, and through the Avid Customer Association and the Advisory Boards that are within this Association, not only are we listening to the customers, but we're also working with them hand in hand to drive or determine the shape of the broadcast industry in the future. A very good example of that is the Standards and Practices Advisory Board. We have our senior management and our chief architects involved with senior management from broadcasters from around the world and we work on endeavouring to standardise formats for things like 4K, for video over IP, even things like HDR formats, and these are things that are very relevant for customers today and will be even more relevant in the future. So we're not working unilaterally, we're working together with our customers to do that. NZVN
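Yew-Jin's Sports Logger description boils down to attaching timestamped, searchable metadata to content as it is ingested. The bare-bones sketch below is the editor's illustration of that idea only – the names and fields are invented, not Avid's actual data model.

from dataclasses import dataclass

@dataclass
class LoggedEvent:
    timecode_s: float   # seconds from the start of the live ingest
    event_type: str     # e.g. "goal", "penalty", "substitution"
    player: str

log: list[LoggedEvent] = []

def press_preset(timecode_s: float, event_type: str, player: str) -> None:
    # What a preset button press would record against the growing media file.
    log.append(LoggedEvent(timecode_s, event_type, player))

# Logged live during the match ...
press_preset(754.2, "goal", "Player 9")
press_preset(1310.8, "penalty", "Player 4")
press_preset(2105.0, "goal", "Player 9")

# ... then at half time, pull every moment involving one player for the recap.
recap = [event for event in log if event.player == "Player 9"]
print(recap)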



Wellington - Head Office Level 3, 127 Park Road Miramar Ph 04 380 5010

Auckland Level 2, 24 Manukau Road Newmarket Ph 021 863 324

Editing Solutions Experts

The experts providing end–to–end editorial solutions to New Zealand’s film & television industry Authorised Elite Partner and Trainer for the complete Avid Video, Interplay & Storage ranges. Official New Zealand Distributor for Avid Media Composer, Avid ISIS & JMR Storage. We are proud to have been selected as one of Avid’s top 100 partners worldwide.

Avid NEXIS Pro
Avid shared storage has never been cheaper. Get the industry-proven performance and reliability of Avid ISIS 5500 and 7500 shared storage in a more affordable entry-level system designed for independent editors, audio post pros, and smaller post-production houses. No need to ditch your favourite editing and asset management tools to accelerate your workflow. NEXIS Pro enables real-time editorial collaboration using what you have, whether you edit in Adobe Premiere, Apple Final Cut Pro, or Media Composer. Special academic pricing is available to approved institutions. NEXIS E2 20TB/60TB & E4 40TB/120TB Engines are also available. To book our demo system for an on-site test, or for a system quote, please give us a call.

• Base-band and file-based workflows
• Edit-while-capture / ingest
• Multi-format | Multi-resolution
• Transcoding
• Avid | Interplay Support
• Clip stitching
• FAST GoPro media transcoding
• Simultaneous generation of files for multiple destinations

We have a demo MOG mxfSPEEDRAIL file and baseband ingest system available for demonstration. mxfSPEEDRAIL is a qualified Avid Partner Product. Contact us to find out more.

www.atomise.co.nz | atomise@atomise.co.nz Check out our new website www.atomise.co.nz


MOG by Atomise For Atomise, we are at MOG Technologies with Nuno Magalhaes. Ed: Nuno, we would like to know if you have come out with anything new since NAB? Nuno: What we are proposing here is an alternative solution to what the customers are used to seeing. They are used to seeing centralised ingest solutions coming out from MOG. Now we are becoming technology co-partners for New Media Channels. This is a new positioning for the company which complements our well proven background. Ed: And when you say “New Media” you’re talking about streaming? Nuno: Streaming, yes exactly. We can provide Cloud environments as well as OTT platforms, along with the possibility of having OVP systems that can manage the distribution of the channels that you have in the New Media Channels context. Ed: So in a broadcaster, this would normally have been done by another party, so they would use MOG for that ingest part, but then you would have to have somebody else to do the streaming? Nuno: Yes of course. Now we are able, with the technological partnerships that we have made, to integrate the Cloud environments directly in our solutions, so we are able to provide CDNs or CCEs that will deliver the content directly to the users so they can preview the streams directly in their players. You don’t need to refer to another third party vendor, you just need to come to MOG and we’ll set up the workflow end to end, from the 360 capture or plain capture, to the ingest, the content management through the OVP platforms and the delivery through CDN, up to the players that the users will use to see the content. So basically, this is one single workflow brought by MOG to the customer. The benefit of it is that the customer doesn’t have to pay a subscription fee; we are providing the technology directly to the customer. So we implement the solution there, he is able to create his own channels, monetise the content that he has because he’s able to use his ads according to the playlist that he wants to use. All the revenue coming from those ads will be in the pocket of the broadcasters. It will not go elsewhere, as in the familiar case where revenue goes through YouTube for instance – and that’s just a name used as a comparison, of course … you can eventually get a little bit of money from the viewing experience that the broadcasters are providing, but not the full revenue. In this case, we have the technology in-house that you are able to manage – through the content management platform, you’re able to view the statistics of what’s being viewed or not, so you can draw your audience and you can rate your playlist according to the best content that you might have. Ed: And there are no limitations in the formats that you can send out? Nuno: No, no limitation. The benefit of coming to MOG is that we are experts in dealing with codecs, as you are already aware, and also different containers, so we have a development team who are more than capable of adjusting codecs that might be needed. But in terms of trending codecs – that is, the codecs that are

Nuno keeping me from the bottle of Port waiting at the back.

used to deliver this media, we have already fitted in all the necessary codecs. So you have protocols that are being transmitted directly, such as HLS, RTMP, depending on the platform that you are going to have a viewing experience, those codecs are free for ingest into MP4 formats depending on what you want to use of course, and then deliver to you and determine protocol that you might want to use depending on the device that you have. Ed: Okay, that’s for the broadcaster and obviously we know that MOG plays very well with Avid and most broadcasters have Avid, so that ballpark’s sewn up, but what about for the smaller station where they don’t have a large infrastructure but they want to go into this New Media. Can you start with a fairly small package and scale up? Nuno: This is a solution … the way that we are promoting the product is pinpointed to the customer requirements. So basically, we are not going to propose a solution that is not going to be affordable or needed directly for some of the small end users or the small production facilities depending on what they want to use. But what we have here is a direct solution fitted to the customer’s budget. So if the customer already has some components inside, if he uses his own CDN let’s say, if he uses his own content management application, we don’t need to overlap that. We can just provide the acquisition part and if it’s not the acquisition part, the players that will be used to playback the 360

Page 18


or plain files depending on what you want to use. This is in an OTT platform of course. Ed: I’m just interested in how far a MOG solution could help a small station, such as a regional station setting up in Christchurch, where you might have 600-700,000 people. Is this a viable situation? Nuno: It’s always a viable situation depending on the content that you’re going to be providing to your viewers and the expectancy that you are going to have of drawing an audience to these OTT channel platforms, because if you have large viewing numbers – even 7000 people is pretty large in terms of viewing – and depending on the content that you might have, it can be largely monetised. There’s a lot of publicity that can be included there, so it can be very useful for the small broadcasters also to get some revenue which is not withdrawn to third parties; there are no subscription fees whatsoever, just one small amount, one small cost for implementation of the project and you’re done. Depending on the situation and how you are going to use the OTT platform, you can have 360 environments, or you can have the plain or normal environments, and this is where the capture, or the acquisition, plays a fundamental role, because if you’re talking about the professional environment, you’re going to use Blackmagic cameras or other types of camera, and this pushes the cost up quite a bit. If you’re talking about GoPros for sports events, where there’s nothing professional let’s say for a 360 environment, it will reduce a lot of the cost, but actually GoPros are not our market, so we are aiming for the professional market. But it’s also all supportable of course. The thing is, depending on what you’re going for with the acquisition, in terms of the amount of media that you’ll have for central ingest, it might be only one piece of equipment, or two, to provide the amount needed for the ingest, and the system can be scaled up later on. Ed: This is the point – it is a viable situation and this is where having an integrator like Richard Kelly at Atomise is ideal for you because, if interested parties talk to Richard and his team, they can tell you “you’re dreaming” or “no, that’s a good idea and this is how we can help you”? Nuno: It’s always good, even for a small TV station. From time to time, they can produce their own content; if they produce their own content, you can engage the audience in ways that are still being explored. We have cases like The X Factor programme that we’ve made for a regional broadcaster, RTP. This is a programme that is an immersive experience in terms of viewing all the transmission made in the studio. That live experience catches an audience and we knew from that this is the way to go. RTP is one of our pilot stations; we work together seamlessly to understand whether the technology can be provided to broadcasters or not, and whether it will be viable or not. In the case of Richard being able to promote this effectively to small broadcasters, this is totally "yes." NZVN

NZVN ADVERTISING RATES

NZ Video News is posted free to New Zealand video production professionals - if you know someone in the business who would like NZVN too, tell them to write or phone us

Rates from April 2008 – NZVN AD RATES (excluding GST)
A6 (must be set portrait): $80
A5 (must be set landscape): $140 for one; $270 for two in the same issue
A4 (must be set portrait): $260 for one; $250 each for more than one in the same issue
Spot colour: a supplement of $120 per A3
Full colour: a supplement of $420 per A3
Classified, 40 words pre-paid cash: $20 ($23 including GST)
Loose inserts are accepted conditionally

AD COPY REQUIREMENTS: To qualify for listed rates, all copy and artwork must be submitted in photo copy ready form - black on white, as an Adobe PDF file - print optimised. Email to <finnzed@xtra.co.nz> AD DEADLINES:

SEE FRONT PAGE

Pay by cheque or direct credit to ANZ # 010242-0160111-00 Page 19



Pixel Power at IBC For Gencom, we are with Keith Bremner and we’re going to talk to Mike O’Connell about Pixel Power. Ed: We start here at Pixel Power because this is a big product for Gencom and I see signs around here that say “4K 4K 4K, in the Cloud today”? Keith: What would it be without a bit of Cloud thrown in eh? Yes, Pixel Power’s a company that Gencom have been working with for a long time now. Their background is with graphics, their heritage is as a classic CG, but over the years, it’s developed into much more than that. They’ve taken on the playout industry first with the BrandMaster, which was the master control switcher built on top of the graphics engine, and then over the last couple of years, they’ve added Channel Playout to that as well, with ChanelMaster, which is a full playout and automation system – again built on that same graphics backbone. Mike: Our ChannelMaster is our channel in a box solution and is made up fundamentally from BrandMaster – master control switcher, our graphics engine and a video server all in one box. We’ve had that released for about 3½ years, we’ve got a couple in New Zealand – NZRB utilises them for playout on both of their channels under our Gallium automation control. About 2 years ago we put a team together and said right, we need this solution all in software. So we’ve used all of the technology we’ve learnt over the years in our hardware and software based solutions and have designed a total playout solution which is all software. It’s software defined networking which is something you’ll hear in the IT world and we have a solution, called StreamMaster that we’ve sold already and it’s deployed around the globe. This allows a lot of good things … with software, it means that you can run it on standard off the shelf hardware, we’ve got a hardware box that we can put our HD-SDI card in, which is new for the show, or you can run it as a virtual instance in a virtual machine, or you can run it in the Cloud as well. We’re actually playing a channel out of Amazon at the moment, so this channel can be controlled here from the show floor and is playing out of Amazon. It’s all the same software no matter what technology platform it’s deployed on. Ed: So this is not necessarily off the broadcaster’s own server … you can be taking material in from other streams and incorporating it into your playout? Mike: Yes, StreamMaster can take standard OB feeds, whether they be HD-SDI or IP, and mix into those and do all your mixing between programme, commercials and promos. The beauty here as well, now that it’s software, is that StreamMaster playout solution is more powerful than our ChannelMaster hardware solution. We were always limited in the past where we could only have one keyer that could do real time 3D graphics. It was really good elaborate real-time 3D graphics, but now that it’s in the software world, we can

Keith and Mike.

put 2 or 3 of these software graphics engines in the one master control playout and that can really give you a whole new graphic workflow and look. Ed: Is most of this programmed or real time? Mike: It’s all real time. The schedule comes from a traffic department and gets ingested into Gallium which then loads the StreamMaster with the correct media assets ready to play. StreamMaster will play these assets as per the schedule and will do all the necessary mixing between programmes from that schedule including keying on secondary events such as graphics and voiceovers. When you’re doing the mixing to live feeds and everything, it’s all real time live. The graphics are played in real time, they’re not pre-rendered. We do have render solutions, but with our playout, you can do everything in real time. The Amazon graphics – the graphics we do on standard "off the shelf" hardware or in a virtual environment or on the web have the same feature set. It doesn’t matter what technology platform the solution is deployed on, I can run real time 3D DVE on off the shelf hardware or in Amazon. There’s no difference. Ed: When you mean “real time” you’re not actually talking about having an operator there constructing the graphic or flicking a switch or something. That’s programmed but the actual creation of the graphic is in real time? Mike: That’s correct. Your graphics department would be creating graphic templates offline. They don’t do that on a day to day basis, because normally a channel will have a look for a specific time and so they create those templates, then the templates are housed within the automation system and in the playout device and from these templates Gallium automatically creates the graphic using metadata from the schedule or other databases and StreamMaster plays these graphics in real time and automatically shows Now, Next, Laters showing what programmes are playing. Ed: So in other words, it doesn’t all have to be rendered into a single file like a timeline and played out, it’s actually constructed on the go? Mike: Yes. Ed: So changes can be made at any point?

Page 21


Mike: Exactly and you can even delete the next programme, insert a new programme, Gallium automation system will see that and will go and recreate that graphics without any human intervention. Ed: So Keith, how do you sell this to a medium / small broadcaster who has currently a playout system that they might have had for 5 years … has the change in the cost and the ease of workflow become so great that this really is a viable thing to put higher up in your priority list? You’re talking from one side of this argument, but obviously, you’ve got to make this case to these broadcasters? Keith: There’s a lot of cost to a broadcaster to refresh and update their entire presentation playout system. You’re talking about a large investment so there’s a drive to do things in the virtualised domain and even in the Cloud. The Cloud playout service is where these things become a monthly software service cost to them instead of a large capital … Ed: And you can offer this? Keith: Yes, we can sell a StreamMaster system, running on Amazon today. Ed: Wow, so this is basically a monthly expense rather than a huge capital outlay.

Mike: It can even be hourly. Ed: I’d be worried if I was a television station and had to buy the playout service hourly. Mike: Yes, but for popup channels, like Olympic channels, or there could be a local community event. You can bring a channel up and just pay for it during that period of time. Keith: Same for disaster recovery, people don’t want to invest a large amount of money to have a DR site sitting there doing nothing for almost all of its life when you can simply enable a channel on a virtual machine, or on Amazon, and be up and running within a few minutes, and you’re only going to pay for that hour or week or month that you might need to use it. Ed: So I guess, having software replace all this hardware is something that’s a bit hard for you Keith to explain to your clients because, over the years, you have sold them some pretty expensive hardware? Keith: Well they’re trying to explain it to me, how they want to go onto software for those reasons. It’s all about agility for them and the ease of being able to pop a channel up and down. Ed: Because a bit of hardware can break quite easily and, if one bit stops, the rest of it’s not going to be much use is it? Keith: The software doesn’t break though, does it? Ed: No, not if it’s installed properly. Keith: Getting back to that agility and keeping up, you can keep a product updated much faster if it’s software rather than hardware. Ed: Yes. Do you agree Mike? Mike: Yes, totally agree. It’s a lot easier to put a channel on air and you can do it long-term or short-term. The cost involved with a hardware solution, you can’t justify that. Keith: And now we have to deal with not just HD but UHD coming along, and you’re not going to see UHD being built on classic hardware, it’s all happening in the software domain, as you can see right in front of you,

doing a UHD channel playback with master control presentation, with automation and there’s no hardware here. It’s all done on Amazon, or alternatively on a customer’s own virtual machine platform. Ed: And training, so something like this, how do you go about training if you install this into a customer’s base? Keith: Well the operation side of it is no different. The user interface, whether it’s on a hardware appliance, or on an “on prem” virtual machine, or if it’s on “software as a service”, the training required is exactly the same. But behind the scenes, under the hood, obviously you need people to be a little bit more skilled with the likes of Amazon Web Services for example. That’s the big change. Mike: We see the benefits of that on the booth. Before we had software-only solutions, playing off servers meant a whole lot of HD-SDI cable and a big router in the back … now we can wire things up very quickly just by plugging Ethernet cables in. It’s saved us most probably a day and a half in setup time for the show.

Ed: And it would save the broadcasters the same if not more? Mike: Oh yes definitely, it’s all relative. And when they have to add new services and they’ve got a public or even a private Cloud, ( they might have their own Cloud ), within 10-15 minutes, they can have a channel playing out. On Amazon, it takes about 8 minutes to have an instant start, so StreamMaster can start within about 8 minutes and that can play out green field material. For a DR solution that’s great, something’s going out straightaway to the viewers, then the automation takes about 4 more minutes to start after the automation instance starts on Amazon, so within 15 minutes, you’re playing out your schedule as if nothing’s gone wrong. We see DR … Ed:

And “DR” is Disaster Recovery?

Mike: Yes, sorry! We’ve got a proof of concept in the Middle East at the moment. They’ve got a traditional automation system, video server, Master Control switcher, graphics device and a MAM. It’s all traditional broadcast just like we do back in New Zealand. They want a DR solution and money has been the objection to that, like it has for everyone else. So what we’ve done is we’ve set up an Amazon instance of our Gallium automation system and StreamMaster our MC playout device. Their on premise MAM from their

Page 22



playout studio is sending media assets for the next 4 days up into S3 storage in Amazon and at the same time, we receive the schedules from their playout system, and Gallium automation sees all of that, makes sure that all the media’s there, and if not, it will request the missing assets. Gallium runs the schedules in parallel to what they’re running on premise. At the same time, if they edit on their facility, they edit a playlist, insert a new programme, go to a live event, we see those changes within the Amazon system and do exactly the same thing on their DR channels. So now, if their system goes down, all they have to do is hit one button and Amazon is feeding their viewers. Ed: Why hasn’t everybody got one? Mike: They’re all moving that way I can assure you. The move is to go, especially for DR, to use a public service like Amazon or Microsoft Azure or any of the others out there. In the US, we’re seeing that the main broadcasters are stipulating that their next investment in playout is virtual, and all software. NZVN
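For readers wondering what the Amazon side is actually doing with those schedules, the heart of it is a simple parity check: compare the asset IDs the playlist needs against the assets already mirrored to S3, and request whatever is missing. A rough Python sketch of that one step only – the function and field names are invented for illustration, not Gallium’s real interface:

def missing_assets(schedule, cloud_assets):
    # schedule     : list of playlist events, each carrying an "asset_id"
    # cloud_assets : set of asset IDs already mirrored to cloud storage (e.g. S3)
    needed = {event["asset_id"] for event in schedule}
    return sorted(needed - cloud_assets)

schedule = [{"asset_id": "EP_0455"}, {"asset_id": "AD_0099"}, {"asset_id": "PROMO_0012"}]
already_in_s3 = {"EP_0455", "AD_0099"}
print(missing_assets(schedule, already_in_s3))   # -> ['PROMO_0012'] would be requested

Run continuously against every playlist edit, that check is what keeps the DR channel ready to take over at the push of a button.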

Object Matrix at IBC For Gencom, we are here at Object Matrix with Keith and Nick Pearce from Object Matrix. Ed: Keith, you brought us here because you reckon this is something that more people in New Zealand should be involved with? Keith: It’s an interesting twist on storage. Object Matrix is a company producing object based storage for the media industry. Ed: And object based storage is what? Keith: That’s really hard to explain and that’s why we’ve got Nick Pearce here. Nick: Object Matrix is a software company and we created an object storage platform called MatrixStore. What it does is that it effectively allows you to have your own private Cloud. Basically, if you store your data in any sort of Cloud based platform today, at the back end there’s going to be object storage, the reason being that object storage is self-managing, scalable and has a very low cost of ownership in terms of management, adding capacity, etc. As the demand for more and more storage increases with changes in formats – HD, UHD, 4K, 8K – there’s more pressure put on organisations to store more, and that means they have to manage media, find media, share it and move it between different platforms. So companies often end up with silos of storage and have no idea what’s on those silos and where to find things. Well, object storage allows you to protect the metadata – which is data about the data – along with the content, so that if you need to find it, it’s like doing a simple Google search on your own private Cloud storage. What’s different about our object storage is, as a company, we only work in the video space.


We understand our customers’ problems and we’ve integrated our storage into the applications that they use, so it’s non-disruptive and seamless. Ed: So you’re saying that a lot of the storage today on the Cloud is already object storage but it’s not your particular flavour of object storage? Nick: I really would be happy if Amazon was using our object storage to provide their services. Ed: But they are using a version of object storage? Nick: They’re making multiple copies of content in object storage platforms. Object storage has been used in the IT world for a long time and that’s where our background is. We’re ex-EMC; in 2001 we were part of the team that created a product for EMC called “Centera” which was the world’s first commercial object storage device and, in 2003, we bailed out to see if we could create our own platform. We’ve got a long heritage in object storage and, as I said, we’re only focused on the video market, so pretty much every member of our team understands video workflow and the challenges our customers are facing, and our product, whilst it is horizontal to many markets, we’ve done a lot of work to vertically integrate it. For example, if an editor is in an Avid environment, they don’t know MatrixStore is there because we built plug-ins so they can just drag and drop content within that environment and it will be automatically archived. Likewise, we’ve built in automated disaster recovery so that, if you’ve got content on servers in one location where access is diminished because of a local outage, you can automatically go to the second location and get your data without any management or human intervention, which is really important. As the volume of data increases, so does the work in terms of media management. MatrixStore, our object storage platform, reduces that amount of work, so much so that, if you buy a MatrixStore cluster, you get a free pair of slippers, because you’ll be able to put your feet up a bit more.

Nick and a lovely lady who happened to be close by.

Page 24


Ed: So a major broadcaster that has an Avid system, an ISIS storage system and Newsroom and everything’s Avid, Avid, Avid – they’re not using an Object Matrix type system … they are using object storage? Nick: More and more, yes. They do have their ISIS for their production editing, but you can’t keep growing that and the new NEXIS platform is fantastic but it’s not a system that’s designed for scale of petabytes serving multiple workflows across the business, so yes, they’re implementing object storage’s nearline. France Television has 12 clusters, the BBC has 3 clusters … Ed: So in other words … I’m still trying to get my head around this, they might have an Avid system, but you can come in on top of that and provide them with an object based storage that still works with their whole Avid system? Nick: Yes. We sit more on the side of it. They have their Avid system, their Avid ISIS and Interplay, their editors use that, and we’re a storage system that sits by the side complementing it and we’ve built tools to make it easy to migrate data between the 2 platforms. So it’s another tier of storage. It’s basically like the analogy of a wanting to eat a pizza: If there’s hot pizza in front of me, and I want to eat that now, the time for me to get that pizza, the Time To First Bite (TTFB), is sub 1 second; If the pizza is cold and in the fridge from last night then that’s another tier of storage, it will take slightly longer to go and get it. If the pizza is in the freezer then I am going to wait a while before I can actually get to eat it.

If you are ordering your pizza to be delivered then you are not sure when it will turn up and you pay for it to be delivered. So if you imagine, your editing storage is like having the pizza in front of me; your nearline is a bit further away, and your frozen pizza is your LTO, because it has a longer Time to First Byte (TTBF) as it may take a while to go and find it and actually get the data you need. Pizza delivery is just like ‘public’ Cloud, someone else is looking after your data and is responsible for delivering it to you, but again it could take a while to get hold of and you have to pay to get your own data back. Ed: And you sit in that nearline area? Nick: Yes absolutely, we’re a nearline archive. Ed: Right, so it’s in parallel with your Avid system? Nick: Absolutely, yes. Ed: Okay. Now what’s the difference between a file based storage system and an object based storage system? Nick: Basically, the fact that, with an object based storage system, you can define metadata, have extra metadata about that file with the object. On a file system, you have to have a separate database that has the information about that file held. With objects, that metadata is held with the object as a sidecar file or wrapped in the object and we have multiple copies so that you’ll always be able to search and find your data without having a big database or metadata controller pointing out where your data is. Traditional systems have metadata controllers, servers that know where the data is on the storage. If that metadata controller goes

Page 25


down, you’ve lost access to your storage. With object storage, an entire server can go down and you’ll still be able to access your data because that metadata is attached to the objects wherever it is in the system. So it’s much more resilient and scalable which is what the whole message of total cost of ownership is ... you don’t have to manage that. Ed: What about in the Sony disc archive system where you can put in a cartridge of discs, the system reads the metadata and stores that somewhere, then you can take the cartridge out and put it on the shelf. When you want that material, the system finds the metadata and says "oh you need to insert cartridge A62." How does that work if your metadata is on the cartridge and not in the system? Nick: Well it can be in both. It will be in the system and it will be on the cartridge. So especially with us, we’ve integrated with Sony ODA; when we move stuff to ODA, we put the metadata with the object on the cartridge and also it will be in the Sony system using the APIs to make sure that it can go and find which cartridge it’s on. The big difference with deep archive is that there are certain environments where, if you need to reuse data very quickly or frequently, you don’t really want it on an offline format because it takes time to go and get it and bring it back. But you will need to have ODA in scenarios where you need to preserve that content long-term for insurance purposes or you know you’re not going to touch it very frequently. But in an absolute disaster, you might go back to it and be able to bring it back. Nearline storage today is much different to what it was 12 months ago in that more and more people are realising that they need to monetise their content. They can’t put that on an Avid system because you need too much of it; putting it on tape is great but it takes time to get stuff back, so that nearline portion and also the Cloud side of things is growing because people need to make money from their assets and you can’t make money from an asset that’s sitting on a shelf. Keith: So how does it actually interact with the file system specifically? Do you interact with the disc directly or do you have a single file that you’re storing binary data on? Nick: Each node is its own system and it has its own file system. So when we send data, it can go onto any node in the system, even if it’s a folder. That folder can be split up but we keep the metadata in a folder structure. So we have file system interfaces, either as a local drive like Avid does, or SMB and FTP, so you can see your vault which can be petabytes big as a file system. But down at the base level, the objects just sit on a standard Linux file system on standard Linux supported hardware. We don’t do anything to the hardware. Our system, our software MatrixStore, rebuilds the view of the file system when you want to see it as a file system. If you want to see it in a GUI view, you rebuild that view and it’s all API driven. We’ve had an API since 2005 and one of the reasons we didn’t sell much until 2009 is because nobody heard of or bought APIs in 2005. So we had to build applications and integrate with the industry in order for people to buy our solution. Since we’ve done that, and we continue to do that, we’ve become more successful. Ed: And Keith you see a greater potential uptake in the New Zealand market for this because this is not only applicable to broadcasters is it – I imagine that

many postproduction houses should be going this way just for the security of their clients’ data? Keith: Every content producer needs somewhere to store their content and with the resolutions getting bigger, UHD coming along, content storage is becoming more of a problem to solve for people. New at the show here is making MatrixStore available as a software only solution to put onto your own bring your own hardware. It doesn’t have to be on MatrixStore hardware anymore, and this should appeal to customers, who prefer to use their enterprise storage partners. Nick: It’s something that we’re doing for customers who need 500 terabytes and upwards. Many organisations have framework agreements with SuperMicro, HP, Dell and even though we’re selling an appliance, it’s difficult to get that new hardware in because it doesn’t fit within their framework. So the reality is that we sit on top of a bunch of Linux; if a bunch of Linux supports the platform, and the bill of materials matches what we specify, then we’ll support it. But it’s not a "download it off the website" option, it’s a collaborative engagement with our partners Gencom, us and the customer, to understand what their requirements are. Also, a number of customers have gone down the road of what we call ‘project versus product’. So they’ve gone and bought petabytes of SuperMicro, built open source solutions like Lustre on top, found that actually, whilst it was cheaper to do that initially, it’s incredibly expensive to maintain and isn’t fit for purpose and it’s actually then more of a project internally than a supported product. So as a supported product, those SuperMicro chassis we can reuse, run our software on top of and turn it into a commercially supported product that integrates natively into the way that they work. One of the big benefits of being a software company is that decoupling, but we’re still selling the appliance because many customers just want to have it turn up, turn it on, make it work. So we’re adding more flexibility into the way that we and our partners like Gencom can deliver the solution. Keith: A surprise here is it’s not just storage, Object Matrix also provides you with a lightweight media asset management system, called Vision. Nick: Yes, whilst not a MAM it does offer some functionality found in many MAM systems. It’s basically a view on your data in MatrixStore and for many organisations who want to digitise legacy content, or share data internally, that's all they need. We have a product called DropSpot, it’s been around since the beginning. It looks a bit like it was developed by the Soviets in the 1960s, but it works and all of our customers use it. But one of the things you can’t do with DropSpot – you know you can add metadata, you can search, but you can’t see what the file is, you can’t preview it, you can’t do all that, and you have to install it on every machine. Today, everyone’s used to nice interfaces with their phones and tablets … our customers want more. So if they don’t have a MAM, or they do have a MAM, they’re still using Vision which is a browser based product supported on chrome mainly, that allows you to see the assets that you have in your nearline archive, regardless of whatever application archive they’re into MatrixStore, you can browse it with a futureproof application, you can share links with staff members, share vaults, add metadata search. It’s a really powerful intuitive interface that comes now with this solution. NZVN
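Nick’s key architectural point – that the metadata travels with every copy of the object rather than living in one metadata controller – is easy to picture in miniature. A toy Python sketch under that assumption (invented field names, not MatrixStore’s actual schema):

# Every stored object carries its own metadata, so any node holding a copy
# can answer a search without consulting a central database.
objects = [
    {"id": "clip-001", "node": "A", "metadata": {"title": "Harbour sunrise", "codec": "DNxHD"}},
    {"id": "clip-002", "node": "B", "metadata": {"title": "Election night", "codec": "XDCAM"}},
    {"id": "clip-001", "node": "C", "metadata": {"title": "Harbour sunrise", "codec": "DNxHD"}},  # replica
]

def search(objs, **wanted):
    # Match objects whose embedded metadata satisfies every requested field.
    hits = [o for o in objs if all(o["metadata"].get(k) == v for k, v in wanted.items())]
    # Replicas are interchangeable: any surviving copy satisfies the query.
    return list({h["id"]: h for h in hits}.values())

print(search(objects, title="Harbour sunrise"))

Lose node A entirely and the search still succeeds from node C – which is the resilience argument Nick makes above.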

Page 26


Cartoni Support We’re at Cartoni for Gencom with Luciano Belluzzo from Cartoni. Ed: Now, I understand some rationalisation of the tripod range has taken place Luciano? Luciano: Absolutely. One year ago at NAB, we started with a new Focus range to rationalise all the ENG products from 8, 12, 18, 20 kilos. At the last IBC, we started with the Lambda range – 10, 25 and 50 kilos – and now the bigger heads on the Master range. We now have the first 2 new Master products, each a 30 kilo head, one for cine application and one for broadcast. They support from 0 kilo to 30 kilo payloads and they have very competitive prices. We have had tremendous compliments from the camera operators, who are really the end users and know exactly how to handle this head, and we are extremely satisfied about the reaction. The products will be in mass production by the end of this year, so we will start supplying in about early January. The Master 30 range is characterised by its tremendous fluid drag and continuous tilt balance. This allows the operator even to make a figure eight shaped movement, which is typical in Asia and Japan, and you don’t have the return effect, which allows very precise shooting, both for cinema and broadcast. It is completely compatible with all the standard plates, so the end user can put it on a different tripod, even if not Cartoni, which allows a tremendous flexibility during operation. Ed: Okay. So existing operators aren’t disadvantaged because you’re still maintaining repairs and spare parts for all the range? Luciano: Absolutely. Cartoni keeps, I should say, spare parts from around 20 years ago – even small pieces. Of course, the demand is becoming quite rare at the moment, but we don’t forget that we always offer a 5 year warranty on all our heads, legs and pedestals, which allows the customer to be completely covered in terms of service assistance. We have very good and trained dealers spread all over the world who have been trained in our Cartoni factory, they know how to handle the problems and we have a continuous exchange of information. So the customer is assured about Cartoni quality. Ed: You’ve got no wooden legs left though? Luciano: No, we don’t, I’m sorry.

Luciano and Keith.

Ed: So in terms of the new availability of heads and legs, if somebody has a set of legs that might be for a particular tripod, but the head’s gone, can they now replace their head with one from the new range?

Luciano: Absolutely.

Ed: So it’s pretty well always interchangeable?

Luciano: Yes, absolutely. That’s the flexibility of Cartoni. We don’t want to forget our existing customers who, in the past, bought a different head. The continuity is guaranteed. We offer continuously new products, even though the current product is still working very well. We’re still repairing and servicing heads which had been made in 1985-86 and we have rows in the factory – they say "this customer should buy a new head", but they are still performing so well, they don’t want to miss their old heads. That’s the strength of Cartoni.

Ed: Okay. Has this rationalisation enabled you to put some of the lessons you’ve learned from certain models and certain ranges and incorporate them all into the new ranges?

Luciano: Yes, well, at the beginning, Cartoni was developing products continuously, putting new products in the market with different names and different characteristics, which sometimes may overlap each other. Now with the rationalisation, we are offering to the end users a complete range of products needed during shooting in broadcast or in cine. That’s the great advantage we offer. So we move from 0 kilo up to 95 kilo, using the same components, using the same technique in manufacturing and assembly.

Ed: And the customers can be assured they’re all still made in your factory in Rome?

Luciano: Yes absolutely. Designed and manufactured with pride in Rome, Italy … everything, everything, even single pieces of plastic are made in our Cartoni factory. I really invite all customers to come to Rome to see the Vatican, to see the Colosseum, to see the beauties of the Eternal City and then pop into Cartoni and we will be delighted to show you our Cartoni factory.

Ed: I’ve been on the tour myself and it was amazing.

Luciano: Thank you.

Ed: It certainly made me more confident with the Cartoni head and tripod that I have, to know that it’s going to outlast me.

Luciano: We are saying that we are proud to be the only European tripod manufacturer.

Ed: Oooh that must have upset some people?

Luciano: (hearty laughter) We are, definitely we are. We are the only European factory making tripods and heads.

Page 27


Keith: I just want to let you know that we’re partnering with our friends at DVT ( Digital Video Technologies ). They are now stocking and selling and showing in their showroom the Cartoni range of tripods and legs and systems. Ed: And this is a good thing because? Keith: Well they have a great shop with a showroom – a new shop too in Great North Road, so it’s ideal. They’re in a good position there and they can do what we can’t because obviously we don’t have a shop. Please go and visit the boys down there, they’d love to see you. They’ve got a good range of the systems in stock and they can supply straightaway.

Ed: But if you’re out of Auckland? Keith: Drive … otherwise I’m pretty sure, the good ol’ postal service thing still works … they can deliver anywhere. NZVN

Page 28


Net Insight for Techtel We are at Net Insight for Techtel and we have Olle Waktel from Sweden. Ed: Now, Net Insight … what I know is that this is high on the list of Techtel’s products that they represent in Australia and New Zealand, and it’s a product that one might think is only applicable to the broadcasters, but in fact the solutions that they offer are going to have wider implications in the production market, not only for sports, but for News and I’m sure in many other applications – is that right Olle? Olle: Thank you for giving me the opportunity to share our thinking and our vision in this area. We are of the strong belief that there are disruptive changes going on in the full value chain; everything from production to contribution to distribution, there are disruptive changes going on. We are active with solutions in all these 3 stages of media production. Ed: You say “disruptive” but is that in a good way or in a bad way … is it changing the status quo?

Page 29


Olle: It is changing the status quo and it means benefits for almost everyone in the game. So if you look at remote production, if we start at that end … people thought that it was just a cost-cutting exercise, but they now realise that the people who have been doing remote production for some time can not only produce more content, but with less cost and less resources. They also accumulate more content in their archive, so instead of having a bunch of camera feeds going into an OB van and returning one highly compressed feed to their archive over satellite or any other link, they now return all the feeds to their home studio and all relevant parts of those uncompressed high quality feeds go into the archive, meaning that they have more content in the archive and all content is money in their bank.

It all happens in there somewhere.

Ed: But it didn’t start that way did it – in those early days, and still I’m sure in many applications, it’s not a case of getting high quality content back to the Newsroom especially, it’s just getting content back and “don’t mind the quality, feel the width”?

Olle: If you go back in time, a satellite link back was the only option. You couldn’t do it any other way. So you had to figure out how to solve the problems. The obvious solution then was to bring a small studio onsite, do the production onsite with a full camera crew and production crew, and maybe go in there 1 or 2 days before, and then they have the match on the Sunday – 45 to 90 minutes, or 2 hours at the most – and then you go home again. So it was a very poor utilisation of those resources. But nowadays, you have the ability to bring all camera feeds to your home studio over a high quality network. So that gives you the ability to take them uncompressed over the network, and what we hear from our customers is that, after they have spent more money on the wide area network bringing the uncompressed or slightly compressed feeds back to their home studio, they still make a 50% cost reduction compared to the old way of doing it.

Ed: So certainly the setting up of an OB van, all the technology that goes with it and the time involved and the people, that’s a huge capital cost to get started, so with this idea of remote production, you still have a large initial capital outlay?

Olle: No, you have a smaller one, because every time you have a technology upgrade, in the old way of working you need to upgrade both your home studio and the OB van. Now you only upgrade your home studio and you only have cameras on the other end, so you actually reduce your investment in the full value chain.

Ed: Okay, so what is the Net Insight technology that gets that uncompressed signal from anywhere back to the studio?

Olle: We are using the Nimbra MSRs (Media Switch Routers) and they are designed to sit on top of any type of underlying infrastructure – it can be fibre, it can be Wavelink, it can be MPLS – anything, the traditional telco infrastructures that are available. So we put our Nimbra gear on top and we do very smart things with protocols to ensure the quality …

Ed: You’re not actually going to tell us what they are?

Olle: (laughs) What we do is we take all the feeds and we synchronise them. So if you have 1, 2, 3, 4, 5, 8, 20 – as many feeds as you want from any venue, we will make sure that they come exactly aligned and synchronised even if they take various routes across the network, because one of the beauties of what we are doing in our sort of IP network is that we have a synchronous transport. We treat everything as IP transport for media, so we have the strictest control over the quality of service. We reshape all the traffic through each node it touches, because otherwise typically you would have a nicely shaped feed coming into the network and after a few hops it will be more and more bursty. But we handle that; we make sure that all traffic carried across the network is nicely shaped from ingress to egress and at each intermediate hop in the network, and we align all the feeds so that they are synchronised in between one another, so if you cut from one camera to another they are all viewed in perfect harmony and sync. So we invest in all these technologies to make sure that broadcast quality is met even if you’re going over a wide area infrastructure.

Ed: What about when you’ve got telcos that don’t have very large pipes, or you’re in an area where an infrastructure from a telco is not very wide? Can you put your technology across a number of telcos?

Olle: Our synchronisation protocols go across telcos, so it creates a sync overlay across them. If you have 1, 2, 3, 4, 5 telcos underneath from A to B, we will make sure that the synchronous behaviour is maintained from ingress to egress.

Ed: Even if those telcos are using different technologies?

Olle: Yes.

Ed: Even if one’s fibre, one’s copper and one might be satellite?

Olle: Yes.
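A simplified picture of the alignment Olle describes – buffer every feed, and only release frames once all feeds have reached the same timestamp – looks something like this in Python. This is an illustration of the general idea only, not Nimbra’s actual protocol:

from collections import deque

class FeedAligner:
    # Buffer frames per feed and release them in lock-step by timestamp.
    def __init__(self, feed_names):
        self.buffers = {name: deque() for name in feed_names}

    def push(self, feed, timestamp, frame):
        self.buffers[feed].append((timestamp, frame))

    def pop_aligned(self):
        # Release one frame per feed only when every feed has the same head timestamp.
        if any(not buf for buf in self.buffers.values()):
            return None                                    # a feed is still catching up
        head = max(buf[0][0] for buf in self.buffers.values())
        for buf in self.buffers.values():                  # discard frames older than the slowest head
            while buf and buf[0][0] < head:
                buf.popleft()
        if all(buf and buf[0][0] == head for buf in self.buffers.values()):
            return {name: buf.popleft()[1] for name, buf in self.buffers.items()}
        return None

aligner = FeedAligner(["cam1", "cam2"])
aligner.push("cam1", 100, "frame-a")
aligner.push("cam2", 100, "frame-b")
print(aligner.pop_aligned())   # -> {'cam1': 'frame-a', 'cam2': 'frame-b'}

The real system does this with constant, bounded latency and re-shapes the traffic at every hop; the point of the sketch is simply that cutting cleanly between cameras only works if every feed is released against the same clock.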

Page 30


Ed:

Wow, I’m running out of questions.

Olle: Because we are creating a synchronous overlay end to end and by doing that we eliminate many of the problems that the people otherwise struggle with. Ed:

And are there any delays in the signal?

Olle: There is always a delay in … Ed:

Aaah finally I’ve found something!

Olle: No, no … we have an extremely short delay and it’s always constant. Going back to, if you have multiple camera feeds, we will make sure that they are exactly in sync, meaning that they will have the same low latency, so that you can mix everything in the control room. Ed:

Are you talking milliseconds or seconds?

Olle: Milliseconds. Ed: Well that’s not really a delay … haven’t you watched television in America? Olle: Well you could say that television in America, I think it’s 5 seconds behind true live, but I think that’s because they have a lot of legislation and stuff like that. But going back to remote production, I think there are benefits for the local telcos – they will get more pipes and better pipes and they will get more revenue from broadcast. But the broadcaster will save 50% at the end of the day because they streamline the workflows and they have a better utilisation of their investment in home studio and people and so on. Ed: So you still have to have some sort of an OB setup where you’ve got … say a sporting event, you’ve got multiple cameras, they’ve got to feed into some central point and you’ve got to have somebody who might be switching and somebody who might be directing and all that sort of thing, but you basically just have to have that connection node to the networks instead of all of those switchers and other machinery? Olle: So you have some camera control units and other stuff that’s sort of onsite and you might have some robotic cameras, but you also might have some cameras controlled by cameramen. But you don’t have a multi-million dollar OB van there, you don’t have a costly satellite link and you don’t have a team allocated for days for an event that is a few hours. But yes, you have some smaller amount of equipment there onsite. Ed:

And I guess the whole system’s scalable?

Olle: Yes, yes. All the feeds that we carry across the network, either it can be … if you’re talking on the lower end, we can have a JPEG2000 compressed feed … Ed: Because here I’m thinking of streaming services where it might not be going to broadcast, it might be going to a streaming service. So is this something that you can scale down depending on the requirements? Olle: We would bring in all the uncompressed or the JPEG2000 compressed feeds back to the studio. What you are talking about is something that will take place in the studio, then they will downconvert it to streaming formats and so on after they have produced it. Ed: Not necessarily. If you wanted to save on bandwidth you could use your technology to send out a more compressed signal? Olle: You can of course go on a higher compressed version, but then you will lose the ability to adapt the feeds completely. If you want to have full editing capabilities you want an uncompressed feed or a JPEG2000 compressed feed. If that is not an option for you, of course you can go with MPEG4 or any higher compression method. But the preferred "modus

operandi" is to have an uncompressed, low latency feed. If uncompressed is not possible due to limitations in bandwidth, you can add JPEG2000 compression and then you take away 90% of the bandwidth, but you still have a perfect lossless compression that can be reconstructed. So you have no generation losses if you go on multiple compression/decompression. Our JPEG2000 has been heavily praised by everyone in the industry, so we believe that we have the best JPEG2000 codec out there, and it’s built into the chassis so it’s an option for you. You can just switch it on if you need it. So it’s not a separate box, it’s a feature that you can switch on or off. Ed: Now taking this a bit to the side here – JPEG2000 I understand is something that’s big in Europe, but elsewhere it’s not really used. Certainly I don’t know of it being used in New Zealand, but in this case, it doesn’t matter, because you’re compressing with JPEG2000 sending it out and then at the other end you’re decompressing it? Olle: Our customers can do it completely uncompressed which will give them the lowest possible latency. If they need to, they can compress it with JPEG2000. So it’s not something that we enforce or require or push them to do – it’s just that it’s available if they need it. Ed: But it is coming back to normal at the other end so the fact that it’s JPEG2000 or XYZ doesn’t really matter? Olle: We say that you can’t really tell the difference between a completely uncompressed and one that is JPEG2000 compressed. Ed: Okay, but can you tell the difference between a 4K signal and an HD signal? Olle: Yes. We support all the formats from HD-SDI to 3G-SDI to 4K and even 8K, so as well our system is scalable, you can just cascade them and you have as many K’s as you want really. Ed:

So you really are offering futureproof?

Olle: Yes. We were demonstrating a 4K multicast from Stockholm to IBC, I think it was 4 or 5 years ago. I think we got an award for that. So we were at the head of the market. But we proved at that point in time that our technology could do it. What we have added after that is the ability to synchronise the various feeds so that if you have multiple cameras from a venue, all the cameras will be in sync, because if you can do that, you eliminate many of the other problems that would occur in remote production. So, in essence, at the end of the day, it gives the broadcaster a 50% cost reduction and more content in the archive, even after they spend more money with the telcos. It’s a win-win for everyone in the industry. The only guys that typically are a bit disappointed in the beginning are the people that want to go onsite, because they meet people, it’s an event, something nice … Ed:

Or people who build OB trucks?

Olle: Some of these guys actually put Nimbras in the back of them so that they can be more flexible and meet more customer demands, but at the end of the day, there is so much power in a 50% cost reduction, increased quality and more content in the archive, that that sort of value proposition doesn’t go away. It will, over time, be the way of doing everything everywhere, and if you then put remote production into context, which we think is happening everywhere, I think that if

Page 31


you go to the contribution side of the industry, broadcasters ordering contribution from one site, 2 sites, 200 sites – they typically broadcast on a static leased line from their telco on a one year contract, 2 year contract, a 3 year contract, so you could only really afford doing the most important links and if you had “on demand” needs you put them over satellite or a courier or anyone that could carry a disc. But what we see now is that the contribution networks globally and regionally go to any “on demand” networks where you’re basically connected to the Cloud and you can reach anyone in any format “on demand”. So your broadcaster will typically have 1 or 2 feeds permanent feeds and new dynamic on demand services but then they have the ability to order more “on demand” on top of that and we see customers in the US started doing that and they’re now connected over 500 broadcasters and have 140 sports venues connected to their Cloud, built on a Nimbra network technology … Ed: And you’re not relying on an overseas director to choose the shots that you want … I guess though one thing does come in here and I do want to finish this off shortly … if all this material is coming from a third party, then how much of it are you paying the rights for? Olle: We are not involved in that part of it because we only facilitate that there is a proper networking technology to bring the content from A to B. But what happens when the media contribution networks start behaving like the phone network, meaning that you pick up the phone and you have your 4K call and you hang up and you’re billed for what you have used – then the utilisation of the infrastructure goes up, because the one providing that can sell that capacity to someone else when you’re not using it and you don’t have to buy it on 3 years’ contract even if you only use it 2 hours

every Sunday. So the utilisation goes up and the telcos can have more attractive prices. So there’s win-win again. When the media is in the Cloud and the contribution network behaves like that, it also means that broadcasters can outsource their production and utilise resources that are in one campus across all campuses. So it will change how the industry can use its resources more and more efficiently. And to do that, we acquired the company ScheduALL last year. They are the software engine that allows you to book resources in the network in a reliable way. But it also allows you to streamline how you are doing your production and all the workflows that you’re doing in the postproduction company or a broadcaster – all these workflows can be booked in ScheduALL so that you increase the utilisation of them while still avoiding conflicts, because when you increase utilisation the risk of conflicts increases. But if you have a proper tool like ScheduALL then you avoid these conflicts, and these systems can then automatically order the capacity across any media Cloud. So that’s the beauty of where we think the industry’s going.

Ed: I think you’ve hit the nail right where it should be hit.

Footnote: In New Zealand, Net Insight and the Nimbra MSR are well established as a key enabler of high quality media transport. Kordia became a Net Insight customer in 2006 and an extensive network of Nimbra equipment is used to provide contribution services – video and IP/Ethernet – for broadcasters. In 2011 Kordia selected Net Insight and the Nimbra 680 platform for the Auckland Metro network. This expansion covers new network sites in Auckland and Wellington. The existing contribution network is also used to provide live sports video content to Kordia's customers all over the world. NZVN
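To put rough numbers on the bandwidth trade-off Olle mentions (my own back-of-envelope arithmetic, not a Net Insight specification): a single 1080p50/60 feed on 3G-SDI is roughly 3 Gbit/s uncompressed, and "take away 90% of the bandwidth" corresponds to roughly 10:1 JPEG2000 compression.

# Illustrative remote-production bandwidth arithmetic (assumed figures).
feeds = 8                       # cameras at the venue
uncompressed_gbps = 3.0         # ~3 Gbit/s per 1080p50/60 feed on 3G-SDI
jpeg2000_ratio = 10             # "take away 90% of the bandwidth" ~ 10:1

total_uncompressed = feeds * uncompressed_gbps          # 24.0 Gbit/s
total_jpeg2000 = total_uncompressed / jpeg2000_ratio    #  2.4 Gbit/s
print(total_uncompressed, total_jpeg2000)

So an 8-camera remote production needs on the order of 24 Gbit/s uncompressed, or around 2.4 Gbit/s with JPEG2000 – which is why the choice between the two comes down to what the available wide area links can carry.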

CONNEX Wireless Transmission For Panavision, we are at AMIMON High Definition Wireless, we have Uri Kanonich, VP Marketing, and we’re here because of the CONNEX system. Now this is a wireless control and transmission system for high quality video and audio and Uri’s going to tell us really what makes the CONNEX system different from other ones because we know that there’s a huge range out there from some quite cheap versions up to some very expensive systems. Ed: Uri, is there a particular area of expertise that the CONNEX system fits into? Uri: Well, if you look at the market, let’s separate between the analogue system which is older and, as you said, cheap or low cost – let’s not insult them, but certainly they are not in the same league. We are talking here about the fully digital system which is full

HD. So again, just one last word about analogue … analogue will be standard definition; of course the analogue signal is quite similar to the quality and performance you got from analogue TVs in the 80s. So if you want to use that you can, and some people are still with that, although I don’t think that too many of the professional guys are there!



Ed: Can I just interrupt here (which I often do). I know that in the days of analogue telephone systems, when they first came in, the reason some people kept their analogue telephone was that you always got some sort of a signal; whereas with digital, you either got the full signal or nothing. Now I guess with a system like this, wouldn’t it be better to get some signal rather than nothing … is that a reason why one would hang onto an analogue? Uri: It would have been a reason if you were limited range-wise or something like that. We are not. So the full CONNEX system gives you 1000 metres of range. I don’t know too many of the professional guys who really put good, expensive cameras on a drone – which is again a good one, an expensive one – who typically take it 1000 metres away; often they are also not allowed to, but that’s their decision I guess. The CONNEX Mini gives you 500 metres, again for most of the applications more than enough, but if you need more, you’ll go for the full CONNEX. Once you’re going digital, apart from the quality, we have all kinds of indications that we added there to signal when you’re going out of range, including telling you very clearly on the screen “hey you’re going out of range, stop there and come back.” In the last year and a half since the CONNEX and then half a year since the CONNEX Mini was launched, we have not heard complaints about the range. Ed: Some people must have pushed that – they must have said “well if they’re saying it will go to a kilometre, perhaps we can push it to a kilometre and a half” if they’re out in big open spaces? Uri: Some people may try it … again we are giving a warning on the screen at some point, so we’ll tell you when to come back. And again, if you do want to go to more than that, then maybe you’ll have analogue that will sometimes work for you, go ahead, that’s fine. I can tell you now very safely that probably 99% of the market doesn’t need more than 1000 metres. People learn that – and again, we’re talking professionals, not some players, and we have a product for the players also that goes for 4 kilometres. I’ll show you that later. But professional guys won’t take their equipment out of line of sight, and certainly not beyond one kilometre where you won’t see anything and you won’t be able to control it and get it back safely. It’s just too dangerous for your equipment. Ed: Because that’s it, you can’t see what the drone’s seeing, so you don’t know really where it is? Uri: Well you will see with the camera and you can do it with GPS, you can bring it back even, but it’s just not safe – not safe to the people below it, it’s not safe for your equipment. You really don’t need it except for really, really extreme companies that know what they’re doing and again this will be the 1%. Ed: So how have you designed the CONNEX system to be specifically targeted at the UAV or drone technology? Uri: Okay, so what do you get with CONNEX? First you get full HD, 1080p 60 for real. Secondly, you get zero delay – that’s very critical in this type of application, so why do I need zero delay? Because I want to control the gimbal. I want to see and be able to control it immediately, otherwise I will try to control and aim the camera to where I want to shoot or where I want even to inspect in an inspection application, and it will take you ages until you do, and sometimes you will just record it, you will come down and you will watch the video and say “hey, I didn’t capture what I

The Mini is really small but powerful.

wanted. Okay, I need to go up again.” That’s a nightmare for someone that is doing something in the field. When we are talking about the range, one more thing … for us “range” means that along all this range the latency doesn’t change, it’s still zero latency, to be very accurate, less than one millisecond, that’s zero. And the quality doesn’t change, it doesn’t go down in the resolution as you go further. It’s not that you start in 1080p 60 and 100 metre later you’re at 720p 60. The reason is that the system is being used, not only for directors or someone who is doing inspection and recording it, but many are using the system for live broadcasting. Live broadcasting is a very big application for this product, because people just use it and broadcast from the air news, from the air covering all kinds of sports events, etc. Ed: It must also be lightweight, because of drone technology, the weight of the camera plus all of the transmitting system certainly is determined by the power of your drone and the battery life? Uri: You’re right. Again, the CONNEX Mini is only 60 gram, that’s including everything. Ed:

Not the battery surely?

Uri: Not the battery, but again you don’t need a separate battery for that. Ed:

Oh, you use the drone’s battery?

Uri: Exactly. It was designed to use the drone battery, so you can work with the drone battery. Ed: And how much of the power use of the drone would be taken up by the CONNEX system?



Uri: It’s not even worth calculating, so it won’t affect the power; the drone itself is so power hungry because it needs to lift all this weight, the effect that we have is really negligible.

Ed: And there’s no point having the camera still going when the drone’s stopped?

Uri: Of course. That’s the CONNEX Mini; the full CONNEX is a bit heavier, it’s 130 gram. Again this typically will go into a heavier drone anyway. So the 130 gram again it will be of no concern for them also. Ed: And really the only difference is the distance that it can go? Uri: The CONNEX has a few more features, but again sometimes people will choose it because of that which is fine. Just to give you an example, the CONNEX will typically work in the 40 MHz band and, in the 40 MHz band, it depends on the region as to the amount of frequency channels that you can use. That will determine how many systems you can have in the same environment, or alternatively, if you have an environment with many interferences, you can find the right slot there. Now the CONNEX also has the ability to work in 20 MHz. That won’t give you full HD 1080p 60; it will give you 1080i 60 or 720p 60 in great quality. And of course you’ll get many more channels to use because you’re using only half the band for each one of the transmissions. So this is one quality of the CONNEX – you have the capability to fix channels, so there are a few more features in the full CONNEX and people choose what they want. I can tell you that our sales are certainly split between them, so we do see that people prefer one over another logically, according to what they need. Ed: And I presume it’s a fairly robust system. Do you know the height from which it can drop and still survive? Uri:

Oh, that is a good question.

Ed:

Is it plastic or aluminium?

Uri: Well both of them are aluminium. The Mini one is aluminium with a plastic cover, just because it’s so small and we want to give you the ability to touch it – it gets a bit hot inside; again it works wonderfully, we’re just being careful of that. I think that if this were to fall from a height at which the CONNEX would stop working, I don’t want to think what would happen to your drone, or your camera of course. It might be a nice thing to shoot from the outside, but I wouldn’t stand under it, let’s say that. Ed: Now of course the buzz for the last year or two has been 4K and I know GoPro produces a 4K camera, although I’m rather sceptical that the lens is up to a true – and I say again, “true” – 4K. Have you got ideas along the 4K line? Uri: Yes, we have a demo here that runs all the time which shows our next generation chip set. Just to explain again, everything that we’ve talked about until now is our own technology – we developed all the algorithms inside our company and we have a special chip set that we design and produce ourselves. So we control the whole system, which is probably the reason we are able to configure it to exactly what we need and get this performance. In this case it’s a new chip set for 4K, it’s our next generation. Again it’s in the lab, so we are calling it CONNEX Lab – we took it from the lab and you can watch it here. The quality is amazing – again it’s 4K wireless in this environment, which means that it can already sustain interference in a crowded environment, with amazing quality, and give you the same abilities we have today and much more even. Again, this is a look to the future; it’s not a product for this year certainly – next year, probably towards the second half, we will start seeing something from us coming there.

Ed: When there actually is a market for a lot more 4K? Uri: Well, like every other technology, I do agree that 4K is certainly coming to the cameras and it is important for editing that’s critical even and gives you an advantage. If you do need it for broadcasting live events, the Olympics etc, we want to be with the trend. I guess it will come like every new technology from top to bottom. It will be expensive like everything new and it will come down and everybody will have it. It always takes more time than you think. Ed: Because I imagine that the development to get to 4K would also give you a parallel development not necessarily to a 360 camera view, but at least a 180 camera view? Uri: It’s actually 360. A guy was just here and I won’t represent him, but there are all kinds of solutions even today for 360 and finally they want to pack it and send it many times wireless, and they are coming to us saying "okay, you have the best wireless, because it’s 360 when they pack it, they want to pack it over 4K to get high quality but all around 360." So certainly that will be one of the cases for using these chips. Ed: But why do you need 360 with a drone, because you’ve got 180 degrees of sky surely? Uri: One of the ideas is to think about giving the audience the ability to watch where you are flying direct, but also go and look around and say okay what’s behind you, what’s in front of you. One of the nice products we have here, a new product that we just launched last month, is not for the professionals but the hobby guys who’ve now started to do drone racing and that’s an amazing guy that’s flying 100 kilometres per hour with drones. Now think about this as an audience, what does it mean that a guy is flying 100 kilometres per hour and you watch how he flies, not only forward, but you can look 360 and see the guy that is running after you and trying to pass you around the corner and cut you off. It’s amazing you know, that’s a new sport that is coming. Ed: Well it’s currently on MotoGP motorcycles – I’ve seen the effect on MotoGP. Uri: Yes, it’s exactly that. On MotoGP, they are doing most of it right now – well halfway it is being done offline and some of it is being done online. It’s amazing. NZVN
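Going back to Uri’s earlier point about running the full CONNEX in 20 MHz rather than 40 MHz channels: the sketch below just makes the halving arithmetic explicit. The amount of usable spectrum is a made-up assumption (it varies by region and regulation); the trade-off itself is as he describes it.

```python
# How many simultaneous links fit in a slice of spectrum? (illustrative only)
USABLE_SPECTRUM_MHZ = 160   # assumed example figure, not an AMIMON specification

def links_that_fit(channel_width_mhz: float) -> int:
    """Non-overlapping channels that fit side by side in the usable band."""
    return int(USABLE_SPECTRUM_MHZ // channel_width_mhz)

print("40 MHz channels (full 1080p60):   ", links_that_fit(40))  # e.g. 4 links
print("20 MHz channels (1080i60/720p60): ", links_that_fit(20))  # e.g. 8 links
# Halving the channel width doubles the number of links that can coexist,
# which is the trade-off described for crowded or interference-heavy venues.
```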




Datavideo at IBC We’re at Datavideo for Protel with Valentijn Diemel. Ed: Valentijn, I guess I have to ask every time I come to Datavideo – “what’s new” – and you always tell me "lots" because, well, Datavideo is a company growing hugely and number 1 on Tyrone’s list as to who to visit for Protel, isn’t it Tyrone? Tyrone: Yes we’re seeing some real growth in Datavideo this year and long may it continue. Ed: Excellent, so where do we start? Valentijn: Just like last year, we’re still perfecting our “from go to whoa” line-up of product. This is the new SE-650 small, cost effective 4 input switching panel. It has 2 SDI inputs and 2 HDMI inputs. It supports up to 1080i and 1080p so it has a place anywhere in a small broadcast and an educational setup. It has a built-in 2 channel Chroma Keyer, Luma key, PiP, wipe generator, still store and tally. So you can do all the nifty tricks you like to do with virtual studio setups and still be on a very reasonable price level of only NZ$2,000.00+GST. Another feature is a built in 6 Channel Audio Mixer.

RMC-400 four channel replay kit.

SE-650 on show.

Valentijn: Next to the SE-650 is a 4 channel replay setup incorporating 4 x HDR-10. You have 4 separate HDR-10 replay recorders, so you can start with only one channel and build it up later to have 4 channels and, with this nice RMC-400 control panel, I can drop a mark-in point whenever something interesting happens; or when I just missed the action as I did over here when you see the swimmer, I can say "okay, I want to have a long replay clip." A long replay clip means that it goes back in time 12 seconds, I can still cue in and cue out my mark-in and mark-out point, and now I have my replay cued up. So whenever my director says “okay you’re live with your replay” I can push the T-bar and show 4 channels at the same time, with a replay. I can control the speed; I can even use the jog dial to go back and forth. Whenever it’s done, it sends a GPI signal back to the switcher, so it automatically switches back to the normal broadcast. So again, this is a 4 channel setup which is really nice especially for the price point. One HDR-10 channel replay recorder sets you back NZ$2,167.00+GST. The RMC-400 for controlling all 4 recorders is only $832.00+GST, so for NZ$9,500.00+GST, you have a full 4 channel replay kit. Ed: Now if you only had one channel, you could choose which camera that you put into that? Valentijn: If you have a switcher in front of it you can select which channel you want to have. But normally, this is an inline recorder, so your signal out of your camera goes in and goes out to your broadcast chain to make sure that your replay actually works.
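The replay workflow Valentijn demonstrates – a rolling buffer you can mark into after the action has happened – is essentially a ring buffer of recent frames. The sketch below is a generic illustration of that idea; the 12-second figure comes from the interview, while the frame rate and class names are assumptions, not Datavideo’s firmware.

```python
from collections import deque

FPS = 50                 # assumed frame rate for illustration
REPLAY_SECONDS = 12      # "a long replay clip ... goes back in time 12 seconds"

class ReplayBuffer:
    """Keeps the most recent REPLAY_SECONDS of frames so an operator can
    drop mark-in/mark-out points after the moment has already passed."""

    def __init__(self):
        self.frames = deque(maxlen=FPS * REPLAY_SECONDS)

    def record(self, frame):
        self.frames.append(frame)        # oldest frames fall off automatically

    def clip(self, mark_in_s: float, mark_out_s: float):
        """Frames between two marks, measured in seconds back from 'now'."""
        start = len(self.frames) - int(mark_in_s * FPS)
        end = len(self.frames) - int(mark_out_s * FPS)
        return list(self.frames)[start:end]

buf = ReplayBuffer()
for n in range(FPS * 60):                    # simulate a minute of incoming video
    buf.record(f"frame-{n}")

replay = buf.clip(mark_in_s=10, mark_out_s=4)   # the swimmer touched ~10 s ago
print(len(replay) / FPS, "seconds cued up")     # 6.0 seconds ready for the T-bar
```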

Ed: So you’ve got every camera recorded?

Valentijn: Yes. Ed: Okay, now let’s discuss fly away kits on display? Valentijn: Yes we have a couple of new models. We have the HS-1200 which is our most basic fly away kit. We perfected our fly away kit; this one doesn’t have an intercom anymore, which is not a setback, but actually, because these fly away kits are meant to be used with pan tilt and zoom cameras, with robotic cameras, you don’t actually need an intercom kit. Without the intercom, we could shave a lot off the price, so the HS1200 HD 6 Channel Portable Production Studio fly away kit is NZ$4,992.00+GST. That is a really competitive price level considering you get a full scale 6 input switcher with a dual chroma keyer, a lot of cross point options … Ed: But still you could use it with camera operators, you’d just have to have your own intercom system? Valentijn: Yes of course, and we have other flavours of fly away kits which still support a wired intercom system. One of the other novelties of Datavideo this year is HDBaseT. We actually made a pan tilt zoom robotic camera with HDBaseT enabled. Ed:

Can you explain HDBaseT?

Valentijn: HDBaseT is used to transport video, audio, DVIP Ethernet Control, power, and RS-422/232 control signal all over one Cat-6 Ethernet cable. So what you see in the top of our booth are pan tilt zoom cameras with HDbaseT output connected with only one Cat-6 cable each. HDBaseT can run up to 100 meter, we have the HBT-11 base station over here, which is breaking out the signal into HDMI power of course and

control signals. So yes, if you’re installing this in, for instance, a church or some other building where you cannot have invasive infrastructure, you can use HDBaseT and just lay one wire which is 5 millimetres in diameter … that’s really nice. Ed: That sounds like a winner for you Tyrone? Tyrone: Certainly, your readers should check out this blog which explains it in more detail – http://www.liveproductionblog.com/what-is-hdbaset-and-how-is-it-changing-the-future-of-video/ HDBaseT is going to have a great future.

Ed: Well that’s the big thing isn’t it, stringing cables into an existing facility, that’s where the nightmare comes. If you’re building the facility to start with or in that stage of construction, not so much of a problem but, hey, one cable? Tyrone: Yes, it’s a winner especially if your building already has Cat-6 cabling in it, or for new building installs it’s a much lower cost compared with installing and terminating co-axial cables. Valentijn: Now the last 2 novelties. We have the new fly away kit and we have HDBaseT, so we thought "why not combine them", so that’s what we did over here. This is a fly away kit that has HDBaseT enabled. That means that, on the back, there are actually only 3 x RJ45 connectors. When you connect your cables to there you can directly connect them to your HDBaseT cameras. On the control surface, there’s a power distributor in there so your cameras are power fed through the Cat-6 cables. So we have a control surface with a very easy to use 4 channel switcher and we have a camera control server over here. This is just a regular camera control surface, so you can pan and tilt, you have a separate button for zooming, and there are 4 reset positions per channel. This is a complete unit to control a full robotic setup.

Ed: And I notice there are also HDMI connectors there so you can use it in a traditional sense?

Valentijn: Yes. There is one extra channel that you can use with traditional HDMI signals, so for instance, if you have 3 cameras and a fourth one is a laptop which you can use for graphics. Ed: So you can’t put 4 HDMI cameras in there. This is specific to Cat-6? Valentijn: Yes, this is specific to HDBaseT. You can connect another way though because our cameras come in the box with a breakout box to convert the signal from HDBaseT into HDMI. So you can still use them to feed to an HDMI switcher.

Fly away kits on parade.

Ed: Well that’s what Datavideo’s known for isn’t it – innovative products that cater for many user options? Valentijn: Yes, of course, making existing technology smarter. Over here is again a new small scale HDMI switcher, the SE-500HD. This switcher is a 4 channel 1080p HDMI switcher. That means that this is a perfect kit to use in any conference setup. For instance, if you have a conference with a large screen on the back, you can use 2 cameras to film the stage, and there are 2 HDMI outputs so you can send that to a big projector. There’s actually also a small audio mixer built in, with embedding options. This unit, again for a very competitive price point of NZ$1833.00+GST, shipping next year, can do basically anything for you. Ed:

And this is the SE-500HD?

Valentijn: Yes. Actually, one of our workhorses which has been around for years and years is the SE-500. It’s still being used in a lot of places in Africa for instance; emerging countries are still using analogue signals, so they’re using the SE-500, and for years and years our distributors from South Africa have asked for an HD version which we now finally have.

So next to that is the HDR-1; this is a compact H.264 recorder. It’s recording onto external USB 3.0 media.

SE-500HD and accessories.



This is just a very simple recording unit – you just plug in any USB 3.0 media, it can be a thumb drive or a hard drive, press Record and it starts recording in 3 preset options of 15, 18 or 20 Megabits. This is a lot of quality for only NZ$832.00+GST. Ed: The major use of your HDR-1 would be as your recorder for …? Valentijn: A backup recorder next to your streaming encoder for instance, or just a video you want to use later for online use and you don’t want to go through the hassle of putting it on a media encoder. This is just recording it in a handable format. And also it’s using just a regular thumb drive, so if you have promotional activities and you have a promotion thumb drive, you can just record a bit of whatever is going on, then hand it over to your customer and you have a nice recording that anybody can play back on any laptop. Ed: And we have to point out the perfect accompaniment to the mixer and the recorder is a Datavideo monitor? Valentijn: Yes, a Datavideo monitor a TLM-170P. This is one of our professional grade monitors. This is a true 1920x1080 progressive monitor. Actually, the price point is starting at NZ$3333.00+GST, which might sound a bit steep, but this is conforming to all the tender needs that we see in broadcast tenders. A lot of times it states that a monitor has to have a minimum lifetime of 50,000 hours, so this conforms to all those demands. It also has tele and onscreen audio monitoring. Ed: And the pièce de résistance … what’s this, the KMU-100? Valentijn: This is the KMU-100 ( NZ$8,333.00+GST ) – our first 4K product. This is a 4K signal processor that turns your 4K camera signal into multiple HD Cameras. Ed: We did actually see this at NAB but it seems to have taken pride of place on the stand here, so obviously there’s been a huge amount of interest in it? Valentijn: Absolutely. Although it’s not a shipping product yet, there’s a lot of demand for it. What we did is we actually upgraded it a little bit since NAB. Before it was a one 4K input with 4 HD cut-outs. Now it actually is a 2 input 4K unit with 8 outputs. So the user can define a cut-out or he can define a motion pad cutout that’s being sent out to one of the 1080p outputs. Because we updated it to a 2 channel unit, you can also choose to have one input, 8 outputs, so it’s 2x4 or 1x8. Ed:

And when do you expect to have this shipping?

Valentijn: October or November, we’re starting up mass production right now. Ed:

So obviously a lot of interest?

Valentijn: Yes, there’s a lot of interest. If I had to say it, I really wanted to have this unit in July already, but now it’s going to be October. Ed: So Tyrone, what are the likely customer areas in New Zealand for this KMU-100?

Valentijn and Tyrone.

Tyrone: The KMU-100 is suitable for use in live events, for instance a press conference is a perfect example. A 4K camera source held above a crowd could then be stripped down to 8 individual cut-outs with individual zoom etc; also places of worship, and broadcast applications where you would require fewer cameras to cover the same event. We are looking forward to it shipping soon. NZVN
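To picture what a 4K-to-HD cut-out is doing, here is a generic sketch of cropping a 1920x1080 window out of a UHD frame. It illustrates the principle only – it is not Datavideo’s processing; the frame sizes are the standard UHD/HD rasters and the pan position is an arbitrary example.

```python
import numpy as np

UHD_W, UHD_H = 3840, 2160      # a UHD frame holds four full-HD windows
HD_W, HD_H = 1920, 1080

def hd_cutout(frame_4k: np.ndarray, x: int, y: int) -> np.ndarray:
    """Extract a full-resolution 1080p window from a UHD frame.
    (x, y) is the top-left corner; moving it over time gives a virtual pan."""
    x = max(0, min(x, UHD_W - HD_W))
    y = max(0, min(y, UHD_H - HD_H))
    return frame_4k[y:y + HD_H, x:x + HD_W]

frame = np.zeros((UHD_H, UHD_W, 3), dtype=np.uint8)   # stand-in for a camera frame
podium_cam = hd_cutout(frame, x=1200, y=540)          # one of several "virtual cameras"
print(podium_cam.shape)                                # (1080, 1920, 3)
```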

Canon Cameras For Protel, we’re here at Canon with Paul Atkinson from Canon Europe and Tyrone Payne from Protel. Ed: Well I must say that at long last I have seen a large form factor Canon cine camera that actually looks like a cine camera? Paul: I told you to trust me last year, didn’t I? Ed: So what made Canon change their minds and actually go that way? Paul: As always, Canon is looking at ways to innovate and we produce cameras that the market wants – and the result is the C700. It’s the new flagship camera, aimed at the highest end of the market, high quality production for cinema and for broadcast. A lot of TV stuff these days is being shot in a very cinematic style, they like the Super 35mm type sensor for its depth of field properties and the look that it gives. Ed: I must say there’s not much else being made out there apart from that style of sensor? Paul: No there isn’t and the reason is that’s what the customers want, and also the market wants. Obviously, you’ve got the manufacturers who make the very large sensor camera that you can only rent from them anyway and a lot of stuff is being done on that and rightly so. Ed: Is it a case that the freelance users are expecting a certain form factor … because the freelancer doesn’t necessarily use the same camera each time, it’s per job, so to have a form factor that’s similar to the other major brands has got to be a good thing and really it doesn’t detract from the Canon technology that goes into it? Paul: Absolutely. I’ve seen the C300 and the C500 being used fully rigged up with all the equipment that you would normally expect of cine style equipment



and all the other bits and pieces that get put on for major production and sometimes, it’s hard to find that there’s a camera body in there. The C700, because of the way it’s designed, is very compatible with incorporating it into the sort of standard equipment you’d find in that sort of film environment. Allied to that is the range of accessories that we’re going to be producing, including the shoulder pads, which means it can be used as a long form ENG run and gun type camera. Ed: But does it also mean to a facility that might have a lot of ARRI tack on gear … Paul: It is Canon through and through. Again, as always, it’s Canon technology, it’s Canon sensors, it’s Canon processors …

Paul and Tyrone.

Ed: No, no, what I mean is that some facilities have kitted themselves out with ARRI matte boxes and all sorts of follow focus things and all those bits and pieces, so will the C700 form factor allow more use of that tack on technology? Paul: Well for rigs and things like that, there’s obviously a very close partnership and in fact, if you look at one of the C700s that we have on the stand, you’ll see that it’s got a new ARRI produced and designed rig on it already. So there’s always good partnerships going on between us and the manufacturers of the supplementary equipment that’s so vital. Ed: Okay, but apart from the form factor, one would expect that going from C500 to the C700, there are a few other things under the hood? Paul: Oh yes, in the C700 we can do 4K in our XF-AVC format now up to 805 megs per second, which, in IT terms, is known as a lot of data … Ed: And recording … that’s internal?

Paul: That’s internal recording to CFast 2 cards. We’ve also included ProRes recording for the first time. Again, there are a lot of people out there using Final Cut Pro who want to be able to put it straight in without transcoding, and they’ll be able to do that for the first time on a Canon product. We’ve increased the options and variations on the colour gamma, the colour setting, Canon Log settings – the original Log, Log2 and Log3 are all on there – and overall, it’s bringing together all the bits of technology that we’ve developed. Remember, it’s only five years since the Cinema EOS line was born, if you like, with the C300. So in that period, we’ve gone through the C500 with the external recorder with cables, to where we are now with the C700 which will have a Codex designed and developed recorder which will dock directly onto the camera. You take off the V-Lock mount included with the camera, you dock in and then fasten it, and it becomes an integral part of the camera. With a firmware upgrade that we’ll release in March-ish, that will be capable of 4.5K recording to a Codex drive and then integrating fully into the Codex workflow. Ed: And if you’ve kitted yourself out with industry standard 4K lenses, it’s ready to go? Paul: It’s ready to go. We’ve got an EF model with the Cinema Lock, same as the C500, and a PL mount … they can be changed in a workshop, so there’s compatibility there. In January-February, we’ll also release a global shutter version of the camera. The major difference between those two will be the fact that the global shutter version will not be able to do autofocus. The autofocus system on the 700 is the dual pixel CMOS autofocus. We’re finding with the 300 Mark II that serious cinematographers and camera ops are beginning to use autofocus, because it’s so good. And I know that face, I’ve seen that face pulled myself and I did exactly the same thing when I thought of … Ed: No, I think you’re taking my expression around the wrong way … I’m actually surprised that they’ve come to that realisation because to me, the camera can set a correct focus a lot better than I’ve seen examples on television of "cinematographers" trying to do it manually. Paul: One of the criticisms we had, especially on the XF205 which is a contrast based AF, was it’s too fast, it’s too quick, it’s not natural enough. So with the C700, our dual pixel autofocus system is slightly tweaked – you can change sensitivity and speed. With an EF lens, you can have face detection and face following, face tracking, but we’ve also got – and we introduced this first on the 300 Mark II – the focus assist feature, that gives you a little GUI, arrows pointing up or down depending on which way you turn the lens, extremely well thought of, very useful, very effective. On the C700, you can actually have 2 areas, so if you’re doing a two-way or you’re having a conversation piece filmed, you’ll be able to select 2 areas within that 80% of the viewfinder, so pulling and pushing focus between the 2 characters will just be absolutely spot on every time; as soon as it’s green, it’s in focus. Tyrone: Who are going to be the early adopters of the C700? Paul: I think the people who are looking at cameras of a similar form factor, but now want to be able to have 4K in a slightly more affordable format, who want to use our colour science and our image quality, and want to be able to integrate it with existing Canon products. The way that the colour science is set up on this camera means you could integrate really easily now into C300 Mark II and XC15. So for the first time, we can now provide you with an A camera, a B camera and a C camera, 2 of which are Cinema EOS products, doing internal recording to 4K in the same format. With external recording, you can go to the Codex recorder I mentioned earlier. So we can offer XF-AVC 4K in 3 different cameras, all working on the same Canon Log setting, all working on the same colour setting, and that’s going to make the postproduction side of it an awful lot easier as well.

Left side C700 …

… right side C700.

Ed: Okay, that’s for the big user, but for the smaller user, there’s been a bit of an upgrade in the XC10? Paul: So we have the XC10 which has been quite well picked up now by a lot of independent filmmakers, journalists, documentary makers. Again, they recognise the image quality they’re getting, they’re getting full 4K internal recording or full HD if they want to. What we’ve done with the new XC15 is we’ve sort of improved the XC10’s professional credentials if you like. Ed: Like adding audio connectors?

Paul: Yes. The MA-400 that was the accessory for the C300 Mark II is now included in the box of the XC15, so that gives you 2 XLR inputs. We’ve also taken off the MP4 recording option of the XC10 and replaced it with the option to film in 24p, and the last one, which fits into what I was mentioning just now about being able to slot it in with the other cameras – you can now select shutter angle as well as shutter speed. There are some other little bits and pieces in there as well, things like you can lock the touchscreen just in case, because if your thumb moves across to it, you don’t want to be deselecting something, so we can lock that now. We incorporated the faster autofocus from the XC10 firmware, so that’s gone straight onto the XC15



as well. And again, it just makes it a more rounded product. As I said, it’s improved its professional use credentials. XC10 is still being used very successfully, professionally, for the right operator, but for those who just want that little bit extra, then that’s where the XC15 comes in. Euro price, you’re looking at around about €2700 for that on launch. Ed:

And in the lens department, a new cine servo?

Paul: We’ve introduced the 18-80 T4.4 cine servo. It’s a nice small compact unit that you can fit on any EF mounted cinema camera. You will find that you could use it on a DSLR as long as you’re cropping down the centre for full HD filming, but you couldn’t use it as a photo lens on that camera. This is giving users the opportunity to get into the cine servo lens family at a much more affordable level. This is much smaller and much lighter than the other cine servos that we’ve got, which have been doing really well – they really cannot make those things fast enough at the moment. But it brings you the other advantages of the cine lens range. There’s virtually zero focus breathing and virtually zero zoom breathing when you’re operating the lens and, of course, you get the fact that they are colour balanced, so if you do swap to a Prime or to one of the other cine zooms or cine servos, then again, if you’re going to try and find any colour shift, you are going to have to get scientific on it, because they’re all balanced to match all the way through. So on a C300 Mark II, which we’ve got an example of here, it just makes a really nice compact camera. The lens has been likened to the DSLR equivalent of the 24-105mm as the "go to – walk around" lens, and for things like documentary makers, even run and gun with the Mark II, it’s compatible with the dual pixel autofocus system as well. So it’s a really good all-round quality cine lens at above 4K resolution. Ed: And obviously there’s a reason why you can make a more affordable lens than in the normal range? Paul: Yes, the 18-80’s maximum aperture is T4.4. Some of the Primes and the other zooms have a much wider T stop capability than that. So that’s able to make the lens a bit smaller and obviously, because there’s less glass, it’s a lower price. Ed: But I guess that’s compensated for by Canon cameras especially being particularly good in low light?

The new cine lens from Canon.

Paul: Yes, it’s one of our strengths that again has become recognised. We’ve got a very good reputation for being able to film in very difficult situations. Ed: Tyrone, what do you think about these new Canon products? Tyrone: The new products have a great deal of Canon developed technology built into them and as usual Canon are aiming these products at professionals who want the very, very best image quality and don’t mind paying a little extra to achieve their cinematic look. We are already taking pre-orders for the 18-80 T4.4 cine servo lens and XC15 in New Zealand. No doubt there will be some interest in the C700 now it has been released. NZVN

G-Technology Storage For Protel, we are at G-Technology and we are speaking with Greg Crosby. Ed: Greg, you say that in G-Technology, “G” stands for “Great” so you’d put yourself at the pinnacle of the storage companies wouldn’t you? Greg: Most certainly. We’re part of a larger hard drive company, we represent a lot of different brands. We’re actually part of the largest storage company in the world today, Western Digital Corporation, which is a mixture of 3 big brands – one is WD, then HGST, formerly Hitachi Global Storage Technologies, who bought IBM’s hard drive business – and IBM was the inventor and creator of the hard drive – and then recently we acquired SanDisk. So we have 3 massive brands and what’s great about it for G-Technology is that G-Technology gets to take all that technology and innovation and put it into our products and solutions. Ed: Okay, and we’re looking at a very special large black box with a big handle on it? Greg: Yes indeed. That’s our G-SPEED Shuttle XL. This is up to 80 terabytes worth of storage capacity; it’s a hardware RAID solution so it supports all the various RAID configurations for data redundancy. It’s Thunderbolt 2 enabled, so you get that really high performance where you’re transferring at 1350 MB/s in RAID 0. RAID 5 is about 1200 MB/s. We also have other versions of it … what you’re actually looking at now is our EV Series integrated version. What that means is that, instead of all 8 drives being 3½ inch based to get to that 80 terabytes – because 10 terabytes is our highest capacity drive – we have 2 of the bays that are for our EV Series, or Evolution Series, ecosystem. That’s all about workflow efficiency, scalability and expandability, where you can take different types of drive media – whether it be spinning disc or solid state types of technology – or we have some very unique things where we’ve collaborated with people like RED Camera, ArtemiS, we even have a CFast 2.0 reader, where we’re optimising the workflow from the capture phase into shuttling and transferring and moving that data around. What’s also really great about this product is that it’s designed to be transportable. So it’s got an integrated handle to be able to quickly move it around and then we have also worked very closely with our friends at Pelican who have a case that supports



that product so you can carry it with you on an airplane, or you can ship it and/or check it in as baggage. This is storage designed to come with you wherever you go, and to be able to capture content. Ed: Okay, so to put that in a nutshell, you’ve got 6 dedicated bays for storage, but you’ve got 2 bays that are flexible bays where you can, using I guess an adapter, plug in various types of media that you’ve got straight off your camera or whatever? Greg: That’s exactly right. The 6 bays are using 3½ inch hard drive storage, so it’s not solid state at this point. Ed: They’re spinning discs? Greg: They’re spinning discs, yes. They’re up to 10 terabytes of total capacity per disc, so you get 60 terabytes in this configuration or 80 terabytes across the larger version of that. Now what’s unique again about the Evolution Series Ecosystem is that you’ve got a variety of different drive modules and what you’ll see on the back of the drive modules is a SATA interface that is kind of unique and proprietary and you also have a USB interface. These can operate as standalone devices or go into a variety of different docking systems where you can take the storage in there. So the concept – just to kind of paint a little picture here – is that you can go directly from the ArtemiS recorder, as an example, and this is the media that we’ve worked with them and collaborated on … so you go directly from the ArtemiS recorder into the higher capacity, higher performance RAID system. At the same time you can back up all that content … Ed: Could you do this live?

Greg: You could, yes. And you can now do all that transcoding, create your dailies or your rushes and have a drive that I can eject and I can give out to all of the team members who are part of the project, and I don’t have to worry about shipping this for example. This can get put into a FedEx mailer or a shipper and get sent halfway around the world to continue the next phase of postproduction. Ed: So really, you’re putting all of that data management into one box?

Greg: Correct, yes, simplifying that whole process, with no need for a lot of cables and other infrastructure to support it. It’s all in this one box and that box goes into a Pelican case. Ed: So even though you might have sent away a drive, you’ve still got everything on that original drive array? Greg: That’s right. With some productions, where they’re out on location for 7 days for example, but they want to be able to send that footage from that day back to home base so they can start the edit, this box will provide that capability from inside of it.

Greg from G-Technology.

Ed: Can you see an application or two for this Tyrone?

Tyrone: Absolutely, because reliable secure storage is a critical requirement for every production, and the size of the files just goes through the roof. Ed: It’s not just the storage though, it’s the management of that data isn’t it? Tyrone: Oh absolutely and I invite readers to take a look at the G-Tech website – there’s one pdf which is very relevant to what we are discussing now and it shows a great workflow. http://www.g-technology.com/sites/default/files/pdf/evseries/g-team-clintonharn-workflow-r3-0916.pdf We’re working with customers to try and ensure that they’re backing up files and not doing things in the field that could end up losing all the data the creative people have just put together. I think there are a number of applications for this box around New Zealand – in fact in just about every video editing facility and obviously in the location zone. Ed: So that’s for the big productions, but for quite some time, you’ve had a smaller version, a dual recorder version, and that’s the G-RAID? Greg: The G-RAID, yes that’s correct. G-RAID is kind of the flagship product of G-Technology; that’s kind of where the company was started. It’s a dual drive solution and we now ship that solution in up to 20 terabytes of total storage capacity. It does come preconfigured in RAID 0 so it’s all about performance, but what a lot of people do with that capacity is that they’ll just put it into a RAID 1, which is a mirrored state, and they’ll backup all their content from that day’s shoot and then it mirrors it so you now have a protected copy along with all of your data.

Ed: But you could perform a similar operation with this in that those drive bays are removable, so again, you could make a copy of certain footage onto a drive, give it to somebody, FedEx it away …? Greg: Kind of. I mean, the G-RAID is more of a desktop kind of environment. They’re basically bare discs. We do have another dock that fits within our EV Series Ecosystem with that same functionality I was talking about. It is a 2 bay Thunderbolt enabled dock as well. It’s very small and compatible, so you can have that data replication component that’s on set, out in the field. And then you have your G-RAID which is your massive storage location to be able to store and backup all that content as well. Ed: Right – it’s just that there are other solutions there; what we’ve looked at is the pinnacle, but if your budget doesn’t quite stretch to that, there are other options with G-Technology? Greg: Most definitely. A lot of people will use everything from our single, you know, 2½ inch disc products that are bus powered, to the ones that fit in our Ecosystem. We see a lot of productions where they’ll use our G-Drives which are single disc 3½ inch based up to 10 terabytes … they use that as a way of shuttling data. And then you also have the G-RAID which is that workhorse, not only for postproduction, but people use it as a way of aggregating all that content on set and out in the field.

Ed: Now what is important is that this is Thunderbolt technology, but I understand that there are various flavours of Thunderbolt technology and you’re looking at the highest level? Greg: Yes, we want to make sure we build the most reliable products out there in the market. All of our products have a 3 year warranty, some of them actually have a 5 year warranty, and we need to make sure that our products are going to be working with today’s technology and obviously, as new technology develops, how it integrates with future technology. We do go through very stringent certifications, so Thunderbolt is a great example of that. It’s a very strict process to go through to make sure that it can function on all flavours of Thunderbolt, and even with some of our channels and how we can offer the product out to the market, we go through additional certifications to make sure that it’s compatible with all the various post systems out there in the market. That’s really one thing that we pride ourselves on – on a lot of our Thunderbolt products, we actually have the Thunderbolt badge, and for us to be able to present that, we have to go through that certification process, so it’s very important to us. Ed:

And to the customer?

Greg: And to the customer for sure, yes, they want to make sure that when they buy a G-Technology product, they’re getting the most reliable product that’s going to work and function for them. Ed: And we’re going to continue with Tyrone, and we’re going to talk about the G-RACK 12 EXP. Tyrone, how many gigabytes is this?

Tyrone: Well the EXP comes in several storage sizes between 48TB and 120TB and it is the Expansion storage which you can add to the G-RACK 12 Network Attached Storage (NAS). Ed: Okay, but why did this excite you? Tyrone: Well quite frankly G-Tech haven’t had a product like this ever, a G-RACK 12 (NAS) … Ed: So this is a studio solution? Tyrone: Yes, the G-RACK™ 12 Network-Attached Storage (NAS) delivers the ultimate in high-performance, centralized storage for small-to-medium size postproduction houses, TV/broadcast studios, ad agencies and in-house creative departments that use Adobe® Premiere® Pro, Apple® Final Cut Pro® X or Avid Media Composer® and other creative applications. G-RACK 12 brings G-Technology’s reliability, scalability, and studio friendly technology to centralized storage. It streamlines demanding media and entertainment workflows of 4K and above with a flexible 12-bay server offering up to 120TB; an optional Expansion Chassis for adding up to another 120TB of storage; integration with popular non-linear editing (NLE) suites; and the latest Btrfs file system for better data management. NZVN
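To put the Shuttle XL figures quoted above into rough perspective, here is a small worked example using the numbers from the interview (8 bays of up to 10 TB, about 1350 MB/s in RAID 0 and 1200 MB/s in RAID 5). The footage size is an arbitrary assumption and real-world throughput will vary with the media and interface.

```python
# Rough arithmetic around the G-SPEED Shuttle XL figures quoted above.
DRIVES, DRIVE_TB = 8, 10
RAID0_MBPS, RAID5_MBPS = 1350, 1200        # transfer rates quoted in the interview

raid0_capacity_tb = DRIVES * DRIVE_TB             # all 80 TB usable, no redundancy
raid5_capacity_tb = (DRIVES - 1) * DRIVE_TB       # one drive's worth goes to parity

footage_tb = 2.0                                  # assumed size of a day's rushes
seconds = footage_tb * 1_000_000 / RAID5_MBPS     # using decimal TB (1 TB = 1,000,000 MB)

print(f"RAID 0 usable: {raid0_capacity_tb} TB, RAID 5 usable: {raid5_capacity_tb} TB")
print(f"Offloading {footage_tb} TB at RAID 5 speed takes about {seconds/60:.0f} minutes")
```

The point of the example is simply that RAID 5 trades one drive’s worth of capacity for redundancy while still leaving enough throughput to turn a day’s rushes around in well under an hour.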


Sound Devices at IBC For Sound Techniques, we’re here at Sound Devices and we have Paul Isaacs. Ed: Now Paul there’s a product that’s of particular interest, it’s been embargoed until today, but now you can talk about it? Paul: We now have wireless iPhone and iPad control of the entire 6-Series of mixer recorders. That includes the 688, the 664 and the 633. Ed: That sounds wonderful but why would a soundie, who’s got this strapped to his side, need to have wireless control of it when he’s right there and he’s got knobs in front of him? Paul: A good question. It’s not just about guys who are running over the shoulder, it’s about cart users as well who often will have their console, they’ll have their recorder and they’ll have like an iPad or some IOS device there as well. But the primary reason for it is not so much about the wireless control, although that does free up physical movement and a little bit more flexibility as to where you stand and move in terms of controlling your gear – the nice thing about iPhones and iPads is the touchscreen and the actual real estate on the LCD, compared to the real estate that’s offered on a 6-Series recorder. Ed: Yes, that might be for your scales but surely, the value of the knobs is that you get that fine feel? Paul: Right. Obviously for mixing, there’s nothing that replaces a hardware knob absolutely, but when you’re doing things like entering metadata and editing metadata, which is a key part of a sound mixer’s role these days, to be able to type very quickly on a virtual keyboard, edit all the metadata really superfast, change track names, scene names, take names, add notes … Ed: As the sound is being recorded? Paul: As the sound’s being recorded, and even going back and editing previous takes; and once that’s done, being able to create a sound report all from a really nice touchscreen interface. It’s much faster, it’s much easier to use. Now obviously, you can do this with the front panel of the 6-Series, but to be able to do it from a touchscreen, a beautiful large-scale display like you have on an iPad or an iPhone, has huge advantages. Ed: Just thinking about this, would there be an application where your actual sound recorder with the boom and the microphone and the pack strapped to his side, could be physically doing the recording job, but someone like the audio director could actually be setting levels and doing things from the iPad at a slight distance?

Paul: Theoretically it could be done that way. I don’t think that’s the normal situation. Production sound mixers generally like to have full control over what they’re doing, in terms of sound capture and the metadata editing. Ed: It’s not that they’re retentive but …? Paul: Well they just want to protect their jobs. They’re responsible, and they don’t want anyone to screw up on their behalf. So basically, what we’re selling is an IOS App – it’s called Wingman, and that’s a free App downloadable from the Apple App Store. To make it work, you need to buy what’s called a WM-Connect and this is like a small … Ed: Dongle?

Paul: Yes, dongle – it’s a Bluetooth Smart to USB dongle that plugs in via a small right-angle adapter here, into the side of the 6-Series recorder. So the WM-Connect is the actual dongle; the App is called Wingman. It’s really simple to set up. You just plug your dongle into a 6-Series via the adapter – and there are 2 adapters that come with a WM-Connect dongle, there’s a right-angle adapter like this, or there’s a 6 inch cable. So if you want to run this dongle round the back you can. Once that’s inserted, all you need to do on your iPhone or your iPad is turn on Bluetooth. You don’t have to do any sort of crazy pairing – all you have to do is just start the Wingman App which, as

The dongle and adapter.


I said, is free from the Apple store. Once you open your Wingman App, it’s going to automatically display a list of all 6-Series recorders in the environment, in the neighbourhood, that have one of these WM-Connect dongles fitted, and then you can go through the list and choose which one you want to control. So a Wingman App can only control one unit at a time, but you can very easily switch between which unit you’re controlling; just by going to this list, you can see all the different 6-Series devices that are in the neighbourhood. Now some people might ask about security. What if someone wants to jump on, turn on the Bluetooth on their iPhone, they’ve downloaded the Wingman App for free, could they just jump onto my recorder and start sabotaging my recording … I don’t know who would, but that potentially could happen. Once you’ve logged onto a device and you have control of it, that device will no longer appear in any list on any other Wingman Apps. So no one can get in, there’s a failsafe there. Now another nice thing is, if you’re in an environment like this and you’ve got multiple units, which there often are on a set, how do you know which one’s the one you want to control? There is an icon I can touch and it will ID a unit so you can very quickly see which unit you want to control. Ed: Okay, so it’s a two-way conversation? Paul: Yes. Ed: You can see one way – yeah, which ones are which etc. Paul: So that’s pretty much the connection – it’s very robust. Here we’re on the trade show floor where there’s a lot of 2.4 Gigahertz going on.
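The discovery and “one controller at a time” behaviour Paul describes can be pictured with a small toy model. This is purely an illustration of the logic with invented names – it is not Sound Devices’ code or protocol.

```python
# Toy model of Wingman-style discovery and exclusive control (illustrative only).

class Recorder:
    def __init__(self, name: str):
        self.name = name
        self.controlled_by = None          # at most one controller at a time

class WingmanClient:
    def visible_recorders(self, recorders):
        # A recorder someone is already controlling no longer appears in
        # anyone else's list - the "failsafe" Paul mentions.
        return [r for r in recorders if r.controlled_by is None]

    def connect(self, recorder):
        if recorder.controlled_by is None:
            recorder.controlled_by = self
            return True
        return False

units = [Recorder("688 - cart"), Recorder("633 - bag")]
mixer, visitor = WingmanClient(), WingmanClient()

mixer.connect(units[0])
print([r.name for r in visitor.visible_recorders(units)])   # only the 633 is left
```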

Wingman in action in New Zealand demonstrated by proud owner Ande Schurr. Pre-order yours from Sound Techniques in the next delivery coming soon.

Overall, we feel that Bluetooth is a much more resilient and robust form of wireless in a crowded environment because it uses a much narrower band than WiFi. Ed: But you’re not sending any audio signal over that, you’re only sending commands? Paul: This is Bluetooth Smart, otherwise known as Bluetooth LE, which has a low data rate compared to full Bluetooth; it’s not at all suitable for real-time audio. NZVN



