Contents

Introduction
All About REX: The NetShow Real-Time Encoder
At the Event: Interface with A/V, Configuring the Encoder, Connectivity to the Server
Configuring the NetShow Real-Time Encoder
The MSBD Connection: Don't Mess with These Bits
Back at the Ranch: A Bit About the MCM Server
Summary
Appendix I: A Sample Timeline for Planning Your Live Webcast Project
Appendix II: Preparing and Posting a Live Event for Later On-Demand Viewing
Appendix III: Live Webcasting, Just the FAQs, Please
Appendix IV: Additional Resources
Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. The furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property. © 1996 Microsoft Corporation. All rights reserved. Microsoft, MS, MS-DOS, Visual Basic, Win32, and Windows are registered trademarks, and Visual C++ and Windows NT are trademarks of Microsoft Corporation in the U.S.A. and other countries. Other product and company names herein may be the trademarks of their respective owners.

Introduction

If you are reading this in 1997 and are planning a live Webcast event using Microsoft NetShow networked multimedia software, you may consider yourself an early adopter of a new and remarkably powerful technology. You may also assume that there are many variables at play in pulling off a successful live Webcast event, many of which we will discuss in this document, and many of which you may discover on your own. Because every event is different and poses unique implementation issues and challenges, do not assume that every variable you may potentially confront is discussed here in detail. While we do present many specific guidelines, we also attempt to give you an idea of the broad range of considerations involved in planning a webcast.

For the most part, this document describes the scenario of Webcasting a live event from a location remote to the point of network service, encoding both video and audio on site and distributing the stream to the Internet at the associated bandwidths. We take this approach because this is the scenario that presents the most variables and issues from the perspective of implementation. Procedures for other types of deployment, such as encoding for audio only, or Webcasting from within a corporate LAN, can be extrapolated from these discussions. On the technical side, the hardware and connectivity recommendations we make here are meant as guidelines that provide the best insurance for deploying a successful Webcast. If you have advertised your event and expect a wide viewership on the web, these guidelines should be taken very seriously. If, however, your event is experimental, or you are evaluating the processes described here, you may get by with lesser configurations. Lastly, please consult the NetShow Product Documentation and the NetShow Content Creation Guide that install with your tools, as well as additional documentation found on the NetShow Web site at http://www.microsoft.com/netshow.

Some Basic Terms and Concepts
How Video Compression Works

Before going too far into the process here, it is useful to understand a bit about how video compression works. The most critical step in creating video content for low-bandwidth delivery is creating or selecting the most appropriate content for conversion. If you're new to this, you'll learn very quickly that certain types of content lend themselves to digitization and compression better than others. Great video feeds don't always make the best low-bandwidth feeds for the Internet. This is because effects like zooms, pans, fades, and certain types of transitions present particular challenges for compression algorithms. When compressing for low-bandwidth playback, the codec calculates key frames and P frames. Key frames are fully rendered representations of the first frame in a new scene. Key frames are automatically generated by the codec when there is more than a 30% change in the pixels from the previous frame. Consider an example sequence in which, over a period of a little more than 7 seconds, there are at least 7 scene changes: 7 key frames, or complete redraws of the frame, would be required.
Because so many key frames are required to render video content like this, the compression algorithm has to sacrifice frame quality or the number of frames per second in order to get it to run over low bandwidths (like 28.8 Kbps). P frames are frames that describe only the changes between the previous key frame and the new frame. When less than 30% of the pixels in a frame change, a P frame, or delta frame, is drawn instead of redrawing the entire frame. Because a P frame carries less data, it is inherently smaller than a key frame and allows you to keep the frame rate and image resolution up. In a sequence where the changes between frames are slight, the same timeframe (about 7 seconds) might require only 2 key frames instead of the 7 above.
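To make the rule concrete, here is a minimal sketch of the decision described above. This is not NetShow's actual codec logic; the frame representation, helper functions, and sample data are illustrative assumptions only.

    # Minimal sketch of the key-frame rule described above -- not the actual
    # NetShow codec logic. Frames are flat lists of pixel values; the 30%
    # threshold comes from the text, everything else is hypothetical.

    def fraction_changed(prev_frame, frame):
        """Fraction of pixels that differ between two equal-length frames."""
        changed = sum(1 for a, b in zip(prev_frame, frame) if a != b)
        return changed / len(frame)

    def classify_frames(frames, threshold=0.30):
        """Label each frame 'key' (full redraw) or 'P' (delta from the previous frame)."""
        labels = ["key"]                      # the first frame is always a key frame
        for prev, cur in zip(frames, frames[1:]):
            if fraction_changed(prev, cur) > threshold:
                labels.append("key")          # big change: redraw the whole frame
            else:
                labels.append("P")            # small change: send only the delta
        return labels

    # A nearly static "talking head" sequence needs only P frames after the first.
    talking_head = [[0] * 100, [0] * 99 + [1], [0] * 98 + [1, 1]]
    print(classify_frames(talking_head))      # ['key', 'P', 'P']

A cut-heavy sequence run through the same function would come back almost entirely "key", which is exactly the situation the surrounding paragraphs warn against.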
The point is that key frames are more data intensive, and more of them are required to render your video stream when there is a lot of motion, or change between frames. You'll notice that your codec is being forced to generate a lot of key frames in a high-action scene when the frame rate seems to stick or stutter. This understanding of video compression, then, suggests that following certain guidelines in setting up your video shoot can help you produce a video signal that is optimally appropriate for streaming video delivery at low bandwidths. These guidelines are discussed in the next section.

Setting Up the Video Shoot

There are any number of possibilities for the type and nature of video content you may intend for webcast. You may, for instance, have an important webcast planned, and have the budget to hire your own A/V crew to shoot the event to your specific guidelines. Or, as is often the case, you will be piggybacking on a video shoot that has been otherwise engaged for the event, leaving you with little in the way of control or input. If you have any input whatsoever as to the nature of the video shoot, and you intend to put the content on the Internet, it's a good idea to convey the recommendations below to the director of the A/V crew. Again, what makes a great live video does not necessarily make a great compressed presentation on the Internet.
In fact, all of this discussion suggests a paradigm shift in producing content for web delivery, and an opportunity for Internet service providers to shape new service packages around new pricing structures. If much of what traditional video services provide (skilled lighting design, direction and camera work, fully conforming NTSC video) is in fact superfluous to your NetShow requirements, perhaps the economy of providing for your own audio and video makes sense. For instance, if what would be best for your Webcast production is fixed cameras on tripods, one on a speaker's podium or a lead singer's mic stand, and perhaps a second camera for cut-away shots, then provision of A/V services loses much of its complexity. As for lighting, there are many circumstances where simply having enough lighting is as important to a Webcast image as a skilled lighting design. Consider these points in relation to the new price points of digital three-chip video cameras that produce extraordinarily high quality signals. As well, consider the exciting developments of IEEE Standard 1394 and the extremely low prices of small, portable video switchers and audio mixers. It becomes a very real proposition that two people (or even one very savvy specialist) with a trunk load of gear can arrive at a site and produce and deliver content for webcast delivery at a fraction of the cost, and perhaps more appropriately, than a broadcast-style video crew. I'm not suggesting that "just anyone" can produce NTSC broadcast-quality video. That's not what we're doing here. But as Webcasting increasingly proliferates, it would behoove any A/V service provider to shape service packages around the new economy and requirements of Webcasting. And unlike investing in broadcast-quality video gear, you could probably finance this with a couple of fresh credit cards. The bottom line is that a single camera trained on an interesting subject can itself constitute compelling web content, so provision for full A/V service as described here is not by default always warranted.

All About REX: The NetShow Real-Time Encoder

The Real-Time Encoder is the machine that takes the audio and video signals, digitizes and compresses them, and provides them to the NetShow Multicast Channel Manager server for distribution. It should be understood that if processing muscle is required anywhere in the NetShow system, it is here at the REX live encoder. Compressing live audio and video in real time is an extraordinarily processor-intensive task, so the REX encoder should be your beefiest machine, and it should be solely configured and dedicated to this task. No other services or programs (except perhaps a NetShow client to view the signal) should be running while this machine is operating. And if you are running your encoder over the Windows NT® operating system, you should keep a close eye on your CPU usage. As well, the higher the encoding data rate, the higher the processor requirements become. So if you're in a situation where you may be encoding video targeted for 56K ISDN consumption, then you should take the following recommendation quite seriously. Whatever machine you use, its ability to reliably encode to your target data type and rate should be thoroughly tested and verified well before the event. We recommend the following configuration for the encoder machine:
At the Event: Interface with A/V, Configuring the Encoder, Connectivity to the Server

Assuming that you do not have access to broadcast-style satellite transmission and uplink facilities (and who does, but broadcasters?), you will be schlepping a computer down to the facility and encoding your Webcast on-site at the live event. When planning for your event, it is important to note that the NetShow technology and your task of encoding the video and/or audio signals represent only the nucleus of your broader concerns. While in the last section we talked a bit about the meeting of the minds with your video provider, in this section we will discuss the physical aspect of that interface. As well, we need to look at the other end: the very important matter of connectivity with your server.

Nuts and Bolts: Setting Up, and the Interface with A/V Services

You will of course need two media feeds from your A/V service: the audio and the video signals. These are delivered over separate cables. You can set your encoding station up anywhere in or around the facility (even outside in a tent, say, if that's your best access), but you must be in a location where the video crew can cable you these two signals. Ideally, however, you should attempt to set your station up at or near the location of the video switcher and audio mixer to put yourself within voice-communication proximity of these services. It is important to arrive early at the venue, and to make contact with the A/V provider at the earliest possible point. Just as they will be testing and tweaking their signal and setting their levels, you too should spend this time verifying the integrity of your feeds. Do not assume that the video crew has ever dealt with this Webcast interface before. The important point here is to not assume that they will have the proper connectors to plug into the audio and video inputs on your cards.

Plugging In: Video

Even the lowest-end video cards provide at least two types of video inputs: composite video (an RCA input, probably yellow) and S-Video (a round input with multiple pins). S-Video is usually regarded as preferred, but it is more likely that you will be presented with a composite video jack as your source. This is fine for our purposes. Since you've assumed that the video crew has never interfaced with a computer before, you've arrived at the venue with the appropriate cable adapters. For instance, on the video side, always come equipped with a BNC-to-RCA adapter. This adapts the familiar coaxial cable that you'll very likely be presented with to the composite video input on your video card. I've seen very expensive productions shut down preparations while a tech runs to Radio Shack for lack of this two-bit little chunk of metal. After plugging in, verify the presence of your video signal at the input on your card by launching any capture program or similar utility that installs with your video card drivers. Simply launching this application should automatically display the video image present at the input (in the case of the Winnov Videum card, this is the Videum Capture utility). If you are not seeing the image, select Video Source from the Capture menu (or similar control), which opens a dialog that typically looks something like this:
Ensure that the Video Source is selected appropriately (composite or S-Video). This dialog also appears in the NetShow ASF Real-Time Encoder (Edit>Video). If you're not seeing your signal, this is most likely why. If you still are not seeing your video, ask your friends on the A/V crew to test your feed through one of their monitors to verify its presence on the cable. It is important to note that when you are viewing your image in this capture utility (or similar application), you are seeing the image as it appears at your video input. This is the data that the Encoder will be compressing. You should take this opportunity to evaluate the quality of the image as it is being supplied by your A/V service. Compare it to the image shown in the A/V crew's monitors. You should be satisfied that the image is similar in sharpness, color integrity, brightness, and contrast to that in the video monitors. If it is not, it is quite likely that you have been supplied a bad video feed, and you should bring this to the attention of the A/V crew. The image they supply should be every bit as clean and full fidelity as that which they are taking to tape. Don't expect the motion in your preview here to be as fluid as that in the monitors (your machine probably can't reproduce that at these preview data rates), but the quality of the image itself should be similarly sharp and crisp.

Plugging In: Audio

Unlikely as it seems, audio presents many more variables when connecting to a computer than does video. While the only real issue with connecting video is the presence and integrity of the signal at the video input, working with audio introduces the variable of levels (loudness of the signal) that must be dealt with if you intend to optimize the quality of your webcast signal. There is one important variable often overlooked in this scenario of interfacing your REX encoder with traditional audio/video equipment: the audio level regarded as a peak zero level (the loudest a signal should be allowed to get) in professional A/V equipment is substantially higher than the level the audio input on your computer regards as a peak zero level. Audio intensity is measured in dB (decibels), on a negative scale with zero at the top representing full saturation. Signals peaking above this value will introduce distortion, known as clipping. A zero peak reading on a professional audio board is an actual electrical energy level of +4 dBm, while your sound card input regards a peak zero signal as having an intensity level of -10 dBm. Also useful to know is that professional audio is carried on a circuit composed of three conductors (negative and positive signal wires plus a grounded shield), called a balanced circuit. The audio signals at your sound card input are of the unbalanced variety, where the signal is carried over just two conductors. Not to introduce too much confusion here, but be aware that you will be best prepared on location if you are equipped with a means to "step down" and adapt the audio signal coming from the sound board to your sound card with a simple external hardware device, as described below. A second important note here is that you should not expect the A/V crew to come prepared with the cable adapters to accommodate the input at your sound card. On most cards this input is a stereo 1/8-inch mini jack, and as common a connector as that may seem, it is never used in professional A/V equipment.
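The practical consequence of that level mismatch is simple arithmetic: the step-down device described below has to bridge the difference between the two reference levels. Here is a rough sketch using only the +4 and -10 figures quoted above; the dB-to-voltage conversion is the standard 20 * log10 relationship, and nothing in it is NetShow-specific.

    # Rough sketch of the level mismatch described above. The +4 and -10
    # figures come from the text; the dB-to-voltage conversion is the usual
    # 20 * log10 relationship. Nothing here is NetShow-specific.

    pro_peak_db = 4.0        # nominal professional peak level (from the text)
    card_peak_db = -10.0     # nominal sound-card peak level (from the text)

    pad_db = pro_peak_db - card_peak_db    # attenuation the step-down must provide
    voltage_ratio = 10 ** (pad_db / 20)    # how much "hotter" the professional signal is

    print("Required attenuation: %.0f dB" % pad_db)            # 14 dB
    print("Voltage ratio: about %.1f to 1" % voltage_ratio)    # about 5.0 to 1

In other words, a professional feed run straight into the card arrives on the order of five times hotter than the input expects, which is why it clips unless it is padded down first.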
While not necessarily the simplest way to handle your audio signal, the best possible practice in our scenario here is to come prepared with a small audio mixing board to handle the interface between the A/V equipment and your encoder. The Mackie model 1202 is a startlingly affordable, extremely compact, remarkably versatile piece of equipment that will provide you with all the control over your audio signal you will need. Trim pots (gain controls) at the input allow you to attenuate the signal to your needs, and unbalanced outputs can be sent to the sound card. It also provides level meters and a hardware means of controlling your audio level to the computer, which is highly preferable to the less-than-convenient software controls. Alternatively, you may use any of a number of inexpensive attenuation devices, available at Radio Shack and elsewhere, made solely to match audio levels between +4 dB and -10 dB devices. Typically these will have a female 3-pin XLR input on one end for the signal from the audio board (the A/V crew should certainly be able to provide you with this connection) and an unbalanced output. Be sure to arrive at the venue with at least this simple step-down device and a cable adapted to run from its output to the input on your sound card.

Setting Audio Levels: Worth Getting Right

The best way to get a reading of the audio level coming into your computer is by using an accurate software audio meter. A small audio meter installs into the Windows Recording Control panel with the drivers for some audio devices, but this meter has no scale, is imprecise, and is inadequate for accurate level control. The best option for getting a very accurate meter for setting your audio levels is to install Sound Forge (a first-rate audio editing software package), launch it, and set your levels while observing the meter in the recording control dialog box. If you do not have Sound Forge, you can get this meter free by downloading the demo version of Sound Forge from the Sonic Foundry web site (www.sonicfoundry.com). It may seem like overkill to install this entire application for this singular function, but this is a great, accurate meter, and it's very important that you set your audio input levels correctly. The Sound Forge input meter from the record dialog is shown below. You should close Sound Forge after setting your levels to free up memory for the encoder. Note also that this meter uses the Windows audio drivers, so you can't activate the encoder while it is running.
Figure 1: The input level meter from the Sound Forge recording dialog box.

Whether using a hardware control on your mixer or the software audio levels control in the Windows Recording Control panel, your levels need to be sufficiently high, but not so high as to introduce clipping into your signal. That is, set your levels to disallow signals that peak at or above zero dB on your audio meter. One good idea is to synchronize audio levels with the A/V audio board source by requesting that a tone be sent through the line at a -3 dB level. You can then set your level controls to show this same -3 dB reading for the tone on the Sound Forge meter. Then, to some degree, you can allow the A/V sound operator to handle audio levels if you trust him to ride his levels accurately. Similarly, if you brought your own audio board, you could synchronize all of the audio devices in the signal path. You should also talk to the sound engineer about how he is processing the sound. For instance, he may have installed a limiter in the signal path that attenuates any signal above a certain peak level. This is a good situation, because it allows you to set your recording level sufficiently high without the fear of confronting unexpected peaks in the input level. Also, he may have introduced a low-frequency equalization roll-off, attenuating the lowest tones below 75 Hz. This is a good practice, especially when producing audio for low-bandwidth compression. This feature is available on the Mackie board discussed above, and should be used as a matter of practice. One more point regarding signal processing: if the audio engineer has applied a "reverb" effect to the audio signal, it is preferable for your needs to ask for a "dry" signal that shows none of this effect. These types of effects overwhelmingly do not compress well to the lowest data rates. This discussion should further illustrate the importance of developing an early and cooperative relationship with the crew handling the cameras and microphones, and the importance of arriving early at the venue and running your setup and verification procedures while the A/V crew is doing the same. We have also emphasized the matter of preparedness. Become familiar well in advance with your equipment, and particularly with the Play and Record Control dialog boxes in Windows. Don't find yourself stymied by one issue or another at the venue, especially when a little practice and verification of your systems in advance may save you the grief of not having complete control over your systems as showtime approaches.

Configuring the NetShow Real-Time Encoder

Central to setting up your webcast is configuring your REX encoder. Based on advance knowledge of the content type you will be encoding, you should have a very good idea before arriving at the event exactly how you will configure your encoder. There is sufficient documentation on how to configure the encoder both in the encoder help files and in the NetShow Product Documentation that installs with the NetShow tools or can be downloaded from the NetShow web site. Many of the concerns in setting up your encoder are the same as those for creating on-demand content, including choice of codecs, frame rate, image size, and so on. In addition to setting your encoding parameters, you will need to select the source of your audio and video signals (the audio and video cards) and the software port destination of the encoded stream. Your choice of encoding parameters and codecs will be contingent upon the type of material you are encoding.
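Underlying that choice is a simple budget: whatever share of the target data rate the audio codec consumes is unavailable to the video codec, a point the next paragraph develops. The sketch below illustrates the arithmetic; the usable payload figure and the codec rates are assumptions for illustration, not NetShow specifications.

    # Back-of-the-envelope bandwidth budget: the audio codec's bit rate comes
    # straight out of the bits available for video. All figures here are
    # illustrative assumptions, not NetShow specifications.

    def video_budget_kbps(target_kbps, audio_codec_kbps):
        """Bits per second left for video after the audio codec takes its share."""
        return target_kbps - audio_codec_kbps

    target = 22.0          # assumed usable payload of a 28.8 Kbps connection
    voice_codec = 2.4      # a low-rate voice codec, as discussed below
    music_codec = 8.0      # a hypothetical higher-rate music codec

    print(video_budget_kbps(target, voice_codec))   # 19.6 Kbps left for video
    print(video_budget_kbps(target, music_codec))   # 14.0 Kbps left for video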
For instance, your choice of audio codecs will be different if you will be encoding spoken word than if your content contains music. Music has an inherently more complex aural structure, and does not convey as well as voice alone through the ultra-low data rate codecs. As well, the video encoding parameters and codec you select will be contingent upon the amount of movement or action you expect in your frames. Audio and video codecs also need to be considered in tandem. For example, voice-only audio converts quite well with some of the lower-rate audio codecs like the Voxware AC24, which consumes only 2.4 Kbps of your target data rate. Although it sounds unlikely, using these limited data rate codecs for voice-only material actually provides for a better video image. This is because when less of the combined bandwidth is consumed by the audio content, the encoder is allowed to encode video data to a higher percentage of your aggregate stream. The result: richer video. For more information on the codecs and their various characteristics, see the very thorough discussion contained in the NetShow Content Creation Guide. For further help in configuring the encoder, see the NetShow Product Documentation in the NetShow program group. Another great way to learn about configuring the encoder is to view the sample encoder configurations (.asd files) that are installed with the encoder software. To view these configurations, launch the encoder, select File>Open, and browse to Program Files>Microsoft NetShow>Stream Formats. These sample configurations demonstrate recommended settings for various content types and target data rates. You may choose to use one of these outright, or adapt one to your particular conditions. When you have finally concluded what the best possible configuration is, be sure to save out your own .asd file (File>Save As) to provide for efficient reconfiguration of your encoder between sessions.

One Last Tweak: Verifying and Fine-Tuning the Video Signal from the Encoder

Once you've established your connectivity to your video and audio sources and configured your encoder, we need to take a look at the final encoded product, verify its integrity, and take the opportunity to possibly add one more level of improvement to the encoded signal. To view the encoded signal from the encoder itself, launch the NetShow Player, select File>Open Location, and type the following URL in the space provided:
msbd://encoder_name:port_number

where msbd:// is the streaming distribution protocol coming from the encoder, encoder_name is the name or IP address of the computer the encoder is running on, and port_number (preceded by a colon) is the port on the encoder to which the encoded stream was assigned toward the end of the encoder configuration process. The computer name and current port assignment can be viewed in the summary settings displayed while the encoder is working. This will bring up the encoded image in the NetShow Player as it is coming from the encoder. It is useful to assess your image quality at this point. For instance, under some circumstances, such as shooting low-contrast images or when there is insufficient lighting on your subject, it may be useful to tweak the brightness and contrast settings of your video card as a way to improve your final encoded output. A typical dialog for doing this is shown in the earlier section Plugging In: Video. This dialog is accessed from the encoder by selecting Edit>Video. Note that this dialog cannot be accessed while your encoder is running, so brightness and contrast correction is kind of a half-blind process: view and assess the encoded image; stop the encoder; invoke the control; make a brightness or contrast adjustment (of only a few points in either direction); close the control; restart encoding; view the image in the NetShow Player; and assess and readjust as necessary. Be aware that you can do as much harm as good with these controls. If, for instance, the brightness setting on your encoding monitor were set unusually low, compensating to produce what you regard as a suitable signal may actually deliver an inordinately bright image to your viewers. So only consider using these controls if you are confident with their functionality and some sort of correction is clearly warranted. Finally, there is one last step in this signal verification process. After initiating your stream to your server (covered in the next section), it is useful to establish an mms:// connection from a web-connected NetShow Player (using the standard NetShow mms streaming protocol) to your server. This, of course, is to verify that your server is delivering as expected, and to view your final content as your viewers will see it. It's kind of an obvious point, but if you don't take the opportunity to see it, you won't really know that it's out there. This process will reveal any potential errors or problems stemming, say, from an error in configuring the server, and should be accomplished in plenty of time to provide for correction. This also implies the need for access to a Plain Old Telephone Service (POTS) line and an activated account with an ISP. Provision of a line explicitly for this purpose should be part of your overall preparedness plan.

The MSBD Connection: Don't Mess with These Bits

With regard to the connectivity of your remote encoder with your point of service (a LAN- or web-enabled NetShow Multicast Channel Manager), there are several potential scenarios here, the most opportune of which is live Webcasting through your corporate intranet. However, since most of our discussion to this point has concerned the, let's say, peculiarities of deploying a live Webcast, interfaced at the event with a video/audio service provider, remotely encoded, and serviced to Internet data rates, we'll continue in that vein. If you're having trouble with your LAN, call your systems administrator. And consider yourself lucky. Enter ISDN.
Integrated Services Digital Network. If you already know all about this, you may still be interested in this discussion of ISDN, because we make some very straightforward recommendations on the nature of your remote encoder's connection to its point of service. Nowhere in the production chain that we've described here is a connection more sacrosanct, yet fraught with inconsistencies and potential for failure. Regardless of your investment and preparation, if this hookup goes down, or data is stalled or interrupted in transit, then all your other expense and preparation is for naught. I believe I have your attention. If you have any serious investment in the staging and promotion of your live web event, then the nature of this connection can only be of one type. And that is a point-to-point, dedicated-bandwidth, dual-channel ISDN connection, whether BRI (Basic Rate Interface) at 128 Kbps or dial-up AMI (Alternate Mark Inversion) at 112 Kbps. If you are planning an event from an oversight level and you don't know what that means, I suggest you find someone who does. But, if you're looking for the executive summary: it's a big, fat phone call, and it will definitely appear in your project budget. And for any computer systems folks just coming online with ISDN, welcome to the club. In my experience, few computer systems engineers have come into enough practical contact with ISDN to develop any real expertise. If you have the initiative to find out more, we surely encourage you to do so. A confident understanding and command of the parameters of ISDN adds substantially to the assurance of success for any webcast, and would be a requirement on any highly leveraged webcast project. In fact, ISDN modems, as set up and addressed through Dial-Up Networking, are not inherently difficult. What is difficult are the variables in service provision from LEC to LEC (Local Exchange Carrier). Variance in service provision creates potential for misunderstanding and difficulties in establishing connectivity. Use of SPID (Service Profile Identifier) syntax is inconsistent between providers. Temporary service, as typically employed in Webcasting, is often installed or provisioned incorrectly. Because of these variables and inconsistencies of ISDN service, we recommend planning and testing your connectivity on a clearly planned schedule. A timeline describing the planning of a live webcast is provided in Appendix I, and a safe schedule for ordering and testing your ISDN services is included. But the basic rule is this: order ISDN installation both at the venue and at your point of Internet service in plenty of time to provide for a thorough testing opportunity. That testing has to be scheduled in time to provide for problem resolution, and this may involve your LSP. We are aware that this is an ideal scenario, but have a plan to control the variables if these recommendations cannot be met. Obviously, variables such as schedule and travel are introduced. So we play the assurance card again. Launch your project with confidence in your ability to control the variables associated with the ISDN connectivity between your encoder and server. And this is achieved only by thorough testing and verification of all your systems, particularly in the area of ISDN connectivity, well in advance of your scheduled event. I know there are some objections to this strident insistence on dedicated bandwidth between the REX and the MCM server. There is, after all, particular expense involved.
Clearly, if you have existing web servers running, all sitting on the World Wide Web with plenty of pipe to spare, why invest in more bandwidth? I'll begin by saying that the MSBD stream between your encoder and your server is your "master" signal, the signal from which all other duplicates will be derived. You may assume the risk of commingling your msbd data with Internet TCP traffic to whatever level you'd like. But it is inherently a bad idea to mix streaming media traffic with "bursty" TCP traffic, and you cannot necessarily expect them to co-exist in absolute harmony in unpredictable or oversubscribed traffic conditions. Any potential for the introduction of latency into the msbd stream needs to be avoided: in a worst-case scenario, extraordinary data latency in the msbd stream can result in timeout errors on the server side. Not pretty. There are two scenarios where running the msbd stream over the Internet might be acceptable. We have had service providers describe to us trunk-level connectivity paths that provide them adequate assurance of msbd stream integrity. You may well be in a position to accurately assess the connectivity path for the msbd signal and arrive at a similar conclusion, provided you accept the balance of risk. The nature of and investment in the webcast production also come into play. The fact of the matter is that you can establish an msbd stream between two 28.8 Kbps modems, and if you're mad about just plain getting live content up on the Internet, then use what you've got and get your material live! As well, if the intention of your first live webcast is experimental in nature, or to establish a "proof of concept," then exercising cost containment is certainly in your interest. But if you've invested to any degree in your webcast, weigh carefully the decision about the nature of your msbd connection against the consequences of failure.

Back at the Ranch: A Bit About the MCM Server

A few more notes regarding the encoder-to-server connection, here from the server side of things. The connection is established between the two computers using Windows Dial-Up Networking. Therefore RAS (Remote Access Service) has to be running and configured correctly on the MCM server. The dial-up connection is initiated from the encoder side. Once the connection is open, the server makes an msbd request to the encoder for its output, which becomes the data source from which the server services Multicast or Unicast client requests for the streaming media content. Server configuration is discussed in detail in the NetShow Product Documentation. Suffice it to say that there are a variety of server configuration and distribution scenarios to provide for various service capabilities, including Multicast and Unicast. However, with our discussion here being focused on delivery of live NetShow content over the Internet, your primary concerns for server configuration pertain to Unicast. Configuring your server becomes particularly straightforward if you are not distributing service any further than this one machine. And you can service a substantial number of concurrent streams from one well-equipped server, well over 1,000 on some configurations. When calculating your service requirements, your most pertinent questions pertain specifically to anticipated aggregate bandwidth requirements.

Calculating Estimated Capacity Requirements

Everyone hopes that his or her webcast event will be wildly successful and popular.
But when planning live events, many people are actually afraid that their events will be too successful. So, start early in planning by estimating the number of people who will access your site and its streaming media content at any given time. These numbers are related to how much effort and investment has gone on in advance of your webcast, as well as what kind of associated linkage to your event you have arranged. Some numbers to keep in mind:
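The original list of reference numbers is not reproduced in this copy. As a stand-in, here is a rough sketch of the kind of arithmetic the next paragraph applies: peak concurrent viewers multiplied by the per-stream data rate gives the aggregate bandwidth your point of service must sustain. The viewer count and stream rate below are hypothetical, and the T-1 and T-3 figures are the usual nominal line rates rather than anything NetShow-specific.

    # Rough capacity estimate: peak concurrent viewers times per-stream bit rate
    # gives the aggregate bandwidth the service point must sustain. All inputs
    # are hypothetical; T-1 and T-3 capacities are nominal line rates.

    T1_KBPS = 1544       # nominal T-1 capacity in Kbps
    T3_KBPS = 44736      # nominal T-3 capacity in Kbps

    def aggregate_kbps(peak_concurrent_viewers, stream_kbps):
        return peak_concurrent_viewers * stream_kbps

    demand = aggregate_kbps(peak_concurrent_viewers=500, stream_kbps=28.8)
    print("Aggregate demand: %.1f Mbps" % (demand / 1000))   # 14.4 Mbps
    print("T-1 lines needed: %.1f" % (demand / T1_KBPS))     # about 9.3
    print("T-3 lines needed: %.2f" % (demand / T3_KBPS))     # about 0.32

Run with a few thousand viewers instead of 500, the same arithmetic quickly pushes the requirement into multiple-T-3 territory, which is the situation described next.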
It can be derived from these numbers that if you expect your event to be successful and you are expecting several hundred concurrent streams, you will require bandwidth equal to multiple T-1 lines for adequate service. If you expect your event to be a real chart-topper, with several thousand concurrent viewers, your service requirements will be equivalent to multiple T-3 lines. Of course, you probably don't have that kind of connectivity lying about the pantry. Rest assured that there are a growing number of dedicated specialty Internet service providers, experts in providing streaming media services, who will be more than willing to provision you with all the streaming media bandwidth you need, for a price, of course. But considering that one month of your own T3 service may run you about $35,000, the prospect of outsourcing the hosting of your Webcast makes particular sense. A list of experienced, qualified NetShow streaming service providers can be found at the NetShow home page (www.microsoft.com/NetShow). Another useful point when thinking about bandwidth requirements for a Webcast: it is very typical for people to overestimate the number of concurrent connections they may expect. Internet users are an extremely fickle bunch with relatively short attention spans. It depends on the nature of the event, but server logs from Webcasts typically show a substantial divergence between the total number of views of an event and the peak number of viewers present at any one time. And it's this last number that you have to consider when calculating your bandwidth needs.

Optimizing the Distribution System for Worldwide and Internal Intranet Performance

One of the great things about the NetShow 2.0 server system is the way it accommodates load balancing across widely distributed servers. If you expect to have a huge amount of traffic in a particular area of the country, you can place additional MCM servers in that region that duplicate (or co-host) a stream originating from another part of the country. Similarly, you can place an MCM server within a company that is protected by firewalls, and redistribute the streaming data within the company using Multicast to ease bandwidth concerns if many people within the firewall are expected to view the event. This is actually becoming a rather common scenario, used within corporations for the internal viewing of important company functions or announcements.

Summary

Hopefully, in reading this document you have arrived at the conclusion that the most important aspects of successfully launching a live webcast event are rooted in precise planning, preparation, systems testing and verification, and an overall understanding of the many variables involved. Few ambitions could be described as more interdisciplinary than that of live Webcasting. Of course, that can be said of many pursuits in web content development, but nowhere else do the principles of broadcast production and engineering, telephony, and computer systems and network administration become so interrelated. Few individuals hold levels of expertise across such a diverse range of disciplines, so Webcasting is inherently a team effort, with input and expertise coming from a number of players in your organization. But conversely, if this document has served to convince you that live Webcasting is just too big and expensive an effort, then we've failed to a large degree.
In order to cover all the bases, we've stitched our story around the steps in planning and executing what might be regarded as a full-blown commercial Internet webcast event. As I've suggested, Webcasting with Microsoft NetShow is nothing if it is not enabling. And any individual's capability to point a camera at a subject, encode a stream, and spin that stream off to a NetShow-enabled Internet server is based upon their ability to comprehend and execute the steps involved as we've described them here.

Appendix I: A Sample Timeline for Planning Your Live Webcast Project

Many points are covered here that were not discussed in the preceding text, such as preparation of PowerPoint® slides if you are producing an advanced event that may include URL flips from pre-cached data.

Six Weeks Prior to the Event
One Month Prior to the Event
Two Weeks Prior to the Event
One Week Prior to the Event
One Day Prior to the Event
The Day of the Event
One Hour Prior to the Event
During the Event
Post-Event
Appendix II: Preparing and Posting a Live Event for Later On-Demand Viewing

You can create on-demand content directly from the video and audio feeds during the event by saving out the ASF output. This can be accomplished at either the encoder or the server, although we generally recommend doing it at the server so as to require one less task of your encoder. Creating your on-demand content in this way has its advantages because it's quick and easy, but you may want to create on-demand content from a videotape instead for the following reasons:
The best place to learn about how to create NetShow on-demand content is the NetShow Content Creation Authoring Guide, which is available on the NetShow Web site (http://www.microsoft.com/netshow).

Appendix III: Live Webcasting, Just the FAQs, Please
Question: How can I protect myself so if there are way more hits than I anticipated or am capable of hosting, we could limit the number of people accessing the content?
Question: If ISDN lines are absolutely NOT available at a venue, are there any other ways to reliably get the feed from the REX back to the server location?
Question: If a customer is afraid of getting too much traffic at a particular event, are there any things they can do to easily spread the traffic across multiple servers? How could they do this if some of the additional servers are physically dispersed? For example, how could they set it up so some of the traffic goes to a server in Seattle, and some goes to a server in NYC?
Question: How could a customer get their live feeds put on the MBONE using multicast in addition to their unicast feeds?
Question: REX seems to be configured for 110 Kbps encoding or lower. What if I want to encode at higher rates?
Question: In buying an encoder machine, which is more important to have, a fast processor, or lots of RAM?
Question: Dual processor Pentium computers for the encoding machine are in short supply. Is there any way to get around this recommendation?

Appendix IV: Additional Resources

Microsoft NetShow Content Creation Authoring Guide - http://www.microsoft.com/netshow/howto.htm
NetShow v2.0 FAQ - Check to see if your issue is answered in the frequently asked questions guide at http://www.microsoft.com/netshow.
Newsgroup - Microsoft.Public.NetShow is a discussion group for all versions of NetShow.
Mailing List (Listserv) - In order to subscribe to the NetShow Mailing List, follow the appropriate instructions:
© 1997 Microsoft Corporation. All rights reserved. Microsoft, PowerPoint, Windows and Windows NT are registered trademarks and ActiveX and NetShow are trademarks of Microsoft Corporation.