With more than 350 million active users worldwide, Facebook sits at the pinnacle of new-age social media and has been declared the world's largest social networking site in existence. Yet despite the enormous populace that uses it for everyday socialization, when leading technology researchers polled committed users, more than 70% were unaware of the story behind the privately held company's creation; even more could not guess the founder's original target audience, with only 34% recalling whom the company was first built for.
Created in 2004 for students at Harvard University, Facebook was the brainchild of Mark Zuckerberg, a student himself and just nineteen years old at the time. In 2003, following a breakup, Zuckerberg built Harvard's Facemash, a campus site that ranked students by attractiveness, similar to the popular website Hot or Not but with a number of differences. Within its first four hours, Facemash generated 22,000 photo views from a visitor list of 450, a success no one foresaw; its popularity convinced Zuckerberg to expand on the idea within the same school year. Building on that late-night success, Zuckerberg began the January 2004 semester writing code for a separate Harvard site, dubbed TheFacebook. Modest about the feat, he released the site and notified only a few of his closest friends; after reviewing the complexity of the code, it was suggested that he announce the site on his dormitory mailing list of 300. A second web-based success followed: in a mere twenty-four hours, TheFacebook grew to more than 1,200 unique members, all within the school's network, and half of Harvard's undergraduate population had flocked to the college social network less than a month after its campus release.
Returning from summer vacation with a successful website, Zuckerberg took on a new roommate, Sean Parker. Parker, co-founder of Napster, tweaked his friend's code in preparation for introducing it to Peter Thiel; Thiel, co-founder of PayPal, made the first investment in TheFacebook, $500,000, within the same year of its creation. Word spread quickly in the crowded market of online social networking. Acting on a tip, Friendster made a full attempt to buy the young company out for $10 million, an offer turned down by Zuckerberg and his co-founders, the dorm roommates and close friends studying computer science alongside him. Soon after that first rejected buyout, a second wave of funding brought in $12.7 million, and TheFacebook dropped the article from its name to become Facebook; this chunk of funding pushed Facebook's overall valuation above $100 million. The Harvard-only restriction was lifted in late 2005 as Facebook expanded to include high school students, maintaining its founding identity as an education-based network. With the rise of its largest competitor, Myspace, however, Facebook made the successful decision in late 2006 to open the network to anyone over 13 with an email account, sustaining itself by appealing to a global market amid a surge of user-generated content. After that move, Zuckerberg was approached by Yahoo with an offer that climbed to $1 billion; it captured his attention, and a verbal agreement was made to sell his prized network to Yahoo, but before the paperwork was signed came a rapid decline in Yahoo's stock.
Still interested, Yahoo came back to the young Zuckerberg but cut its offer by a considerable $200 million, a proposal met with a stern "no." Yahoo would bid a third time on Facebook, again at the full $1 billion, putting Zuckerberg in the press headlines as "the kid who turned down a billion." Wise to the value of his innovation, Zuckerberg held on to his brainchild, and in 2007 Facebook was ranked the fifth most valuable American online company, receiving a further $240 million in funding from computing giant Microsoft; many assumed the grand investment was Microsoft's response to outbid rival Google. Zuckerberg continued to expand his creation to include businesses in 2008 and 2009, with Microsoft and Apple among the earliest names on board, creating a social network serving both personal and professional roles.
Closing out 2009, Facebook is at the summit of its transformation, integrating popular sites and institutionalizing common trends as it responds to the profits being spun from revolutionized ideas of social media, a trend accelerated by the year's popularization of Twitter's micro-blogging concept. To ensure its growth in the years ahead, Facebook works with a team of more than 700 to build in the features members want from a social network; its most recent changes were mirrored from Twitter, just as earlier changes were mirrored from Myspace. Its net of more than 350 million users worldwide continues to change by the minute, reflecting members' acceptance of successful changes, while a few stride behind.
Founded in 2006 by Jack Dorsey, an American software architect and businessman, Twitter gained widespread worldwide notability just three years later. Revolutionizing socialization on the Internet, Twitter built its popularity through free word-of-mouth marketing amplified by mainstream mass media sources, a welcome gift for a company founded on small capital and now collecting revenue in excess of $400,000. Often called the "SMS of the Internet," Twitter introduced the now widely accepted concept of micro-blogging: textual updates restricted to a mere 140 characters for a more digestible form of user-generated content. Users can post through various channels, including the popular web interface, third-party applications, desktop clients, browser applications, website feeds, and mobile phones. By the close of 2009, updates, known as "tweets," were being sent and exchanged among millions of users in a globally diverse, interconnected society, bringing millions together in a well-structured social networking medium.
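The 140-character restriction described above is simple to enforce in code. The sketch below is purely illustrative; the constant and function names are assumptions for this example, not part of any real Twitter API.

```python
# Minimal sketch of micro-blogging length validation. TWEET_LIMIT reflects
# the 140-character cap described in the article; the function names are
# illustrative, not Twitter's actual API.

TWEET_LIMIT = 140

def is_valid_tweet(text: str) -> bool:
    """Return True if the update is non-empty and fits the limit."""
    return 0 < len(text) <= TWEET_LIMIT

def truncate_tweet(text: str, suffix: str = "...") -> str:
    """Shorten an over-long update so it fits, appending a suffix."""
    if len(text) <= TWEET_LIMIT:
        return text
    return text[:TWEET_LIMIT - len(suffix)] + suffix

print(is_valid_tweet("Just setting up my twttr"))  # True
print(len(truncate_tweet("x" * 200)))              # 140
```

The hard cap is what made tweets digestible across every channel, from SMS to desktop clients, since a message valid on one medium was guaranteed to fit on all the others.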
To date, more than 262 websites exist or have previously existed with the same features and social networking objectives as Twitter; these sites are referred to as "Twitter clones." Revealing an intensified global impact, most of these clones operate in other countries in the local language; Twitter's influence in the United States has inspired clones in Japan, China, Germany, India, Spain, Poland, France, Korea, Hungary, Indonesia, Mexico, Singapore, the Czech Republic, Turkey, Portugal, and more. With so many sites popping up across a global subculture, Twitter can fairly be dubbed a raving success; everyone wants a piece of the micro-blogging pie.
China tops the list with 27 Twitter clones and 17 Twitter-like networking sites, past and present; Germany claims second place with 21, followed by Japan with 20. Poland and Hungary have six each, Spain and the Netherlands five each, France four, India and Russia three each, and Korea two. Mexico, Turkey, Indonesia, Singapore, the Czech Republic, and Italy each have a single Twitter clone in their native language, while the United States is reported to have had more than eighty past and present clones. Deeper web research, at the expense of exceptional searching, yields more than 100 additional clones with similar objectives, a number changing rapidly as newly built sites join the puzzle; to be exact, 132 additional clones have been found, some of which have failed miserably after losing their funding while others run on a bare network of members. Even Yahoo has jumped on the clone bandwagon for its Portuguese-speaking clientele.
Most Popular 250 Cloned Twitter Sites provides more details and links on the global acceptance of Twitter's micro-blogging influence.
Globally popularized in 2009 by mainstream mass media outlets, Twitter's micro-blogging marvel transformed an interconnected, user-generated online society in just 140 characters. Three years after its web-based introduction, Twitter climbed rapidly to the apex of social networking, while more than eighty rival sites, dubbed Twitter clones, failed to match its success. Along the way, Twitter brought a host of new technology terms into household use, leading in both concept and practice with the modern-day hashtag. To retrieve a focused search result, millions of Twitter loyalists preface a single term with the pound symbol; no longer just the universal depiction of a tic-tac-toe board, the symbol anchors Twitter's convenient hashtag hyperlinking system, which turns any tagged term into a clickable search. Whether from the conventional web interface, third-party applications, desktop clients, or mobile life on the go, the hashtag has been effective by its textbook definition, though few novice Twitter up-and-comers realize, often to others' annoyance, that a hashtag has both proper and improper uses.
With a Twitter hashtag, users should remember the popular cliché "less is more" and spare the usage. For successful results, a "#hashtag" should emphasize a search keyword, and the chosen keyword should always be a major subject of the tweet; used properly, the hashtag enhances user-to-user socialization rather than threatening it. Both misuse and overuse of the Twitter-born hashtag can cost a user followers through simple annoyance; many note that text riddled with obsessive hash marks is far from aesthetically pleasing, leading to a dropped follower or two from displeased eyes. Discretion is highly advised when using hashtags on Twitter; their use should typically be reserved for proper nouns closely related to the subject at hand. Because a hashtag quickly retrieves a search result, keywords should be chosen with the goal of increasing social networking. Established accounts tend to use hashtags properly, while new users contribute most of the incorrect use; these users must educate themselves to achieve proper, sustained results. For example, to gain the desired socialization and share knowledge about social networking, users can safely hashtag the names of social networks, websites, technological gadgets, retail stores, and so forth; a user should never hashtag articles, verbs, pronouns, prepositions, or other parts of speech that are not nouns.
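The hyperlinking behavior described above starts with recognizing hashtags in text. A minimal sketch follows; the regular expression and function name are illustrative assumptions, not Twitter's actual tokenizer, which handles more edge cases.

```python
import re

# Illustrative hashtag extraction: a '#' followed by letters, digits,
# or underscores. This pattern is an assumption for the example, not
# Twitter's real parsing rules.
HASHTAG_PATTERN = re.compile(r"#(\w+)")

def extract_hashtags(tweet: str) -> list:
    """Return the hashtag keywords found in a tweet, without the '#'."""
    return HASHTAG_PATTERN.findall(tweet)

tweet = "Loving the new #Facebook redesign, reading about it on #Twitter"
print(extract_hashtags(tweet))  # ['Facebook', 'Twitter']
```

A search backend only needs this extraction step to index each tagged keyword, which is why the "major subject as proper noun" advice matters: the extracted terms are exactly what other users will search for.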
Many other twenty-first-century social networks also use pound-symbol tagging, each with its own definition of proper use. On sites that allow more than micro-blogging's 140 characters, more hashtags can be applied to proper nouns; on Twitter, hashtag use should stay limited on account of the limited characters. With recent advances in Twitter's website search, hashtags no longer need to be used to retrieve a search result, yet many third-party search tools still retrieve and record content based on hashtag use. Again, discretion should be taken fully into account before attempting this metadata form of tagging; respect the eyes of the reader.
Creativity, an astonishing trait of intelligence, is associated with the right hemisphere of the human brain; in professional testing, those showing a strong lean toward creativity are often labeled right-brain dominant, as opposed to the mathematical logic of the left hemisphere. The perplexity of the human brain is that, while we use the whole organ, one hemisphere often exerts the stronger influence on who we are. Within a newly interconnected global society, the Internet marks a meeting place for the diverse subcultures of right-brain, left-brain, and whole-brain dominance. As users log on at various connection speeds around the world, the social networking revolution has yielded a kind of harmony, producing thousands of sites that appeal to the vast range of human intelligence; in this feat, diversity comes together to form bonds of global friendship, unlimited in its exchanges of popular user-generated content. Though dominated by its textual form, the user-generated content of the newly defined social media generation constructs itself uniquely in text, image, audio, and video.
First recorded in 1860, photo manipulation has come a long way, digitizing itself into an increasingly popular form of user-generated creative expression, often interlaced with textual, auditory, and video elements. Objectively defined, photo manipulation is the ingenious practice of editing images for a glorified, unique result, digitally creating illusion and deception, as opposed to the correction and enhancement of simple photo editing. Manipulating images demands an immensity of technique, developing skill, and the sensitivity of human creativity. The concept is prevalent in twenty-first-century sharing of user-generated content; associated with Adobe Photoshop, a highly glamorized tool of the trade, photo manipulation is sometimes referred to as "photoshopping," meaning simply the digital editing of images. The term has been used both personally and professionally to denote any form of digital image enhancement, including retouching, combining visual elements into a single image, and color enhancement, though photo manipulation is by no means limited to those three.
A Google search for the term "photo manipulation" alone returns more than five million results, reflecting a rather large web presence. Revolutionized from traditional objectives, twenty-first-century social media has restructured existing sites and opened brand-new ones for the blissful harmony of the right-brain dominant; each uniquely varied site is similar in providing a source for open gatherings, communication, user-generated content sharing, and the intimacy of friendship. Hundreds of Internet sites cater to the photo manipulation trend by offering a platform for photo editing, allowing users to upload an image and apply various tools to embellish it to their personal taste; similarly, hundreds of freeware programs can currently be downloaded so that users can rework images offline.
(Copyright © Social-Media-News 2009)
Photo Manipulation Recommendations:
Microsoft Picture It!
Corel Photo Products
Synonymous with the rebirth of innovation and new beginnings, 2010 has substantial shoes to fill; in the technological record, 2009 departs with historical influence for successfully revolutionizing the fundamentals of social media within a network of interconnected devices and an increasingly diverse, up-and-coming global generation. Seizing headlines, Google made its presence known in 2009, capturing universal attention, flourishing interest from a pristine member base while sustaining its loyalists. Swarming the presses with a host of advancements for the future, with attention to both the past and present Internet subculture, Google redefined technological achievement and successfully expanded the exposure of its growing enterprise; without a stop in sight, Google closes the year operating with the future in mind.
Google's largest accomplishment for 2009 was Google Wave, a personal communication and collaboration tool built for rapidly changing online socialization; announced in May 2009 at the annual Google I/O web-developer conference, the program first launched restricted to developers. An amalgamation of electronic mail, instant messaging, social networking, and the interlinked web pages known as wikis, Google Wave's web-based service benefits users by supporting collaboration in real time, with results personalized by downloading extensions for the Wave application. From an extensive catalog of extensions, users can easily add spelling and grammar checking and reach up to forty computerized instant translations. Google Wave functions much like electronic mail transfers and the aged Usenet; however, instead of counting against user storage as email does, information is stored on a central server. Each textual document sent or received is known as a "wave," backed by an iconic wave "W" logo in the Google colors. Wave also retires Usenet's idea of distributing user-generated content attached to an existing thread. To build a stable, invitation-only foundation, Google opened Wave to an additional 100,000 members, giving existing members the ability to invite their friends and spreading the name through requests for membership.
Google's 2009 success also reflected on previously released programs, such as the 2005 web-based feed aggregator Google Reader. In step with the worldwide revolution in new-age social networking, Reader enticed a new generation, maintained loyalists, and recaptured abandoning users with its ability to read Atom and RSS feeds both online and offline, building an expansive network of users who needed to sort through and maintain content of interest from all perspectives and directions. Simple in its use, Google Reader prompts members to add a website URL into the designated location; once added, updates to that site appear in a personalized, email-like timeline. Without limitation, each member can receive thousands of textual updates per day, sorting through them to satisfy an individual thirst for knowledge. Newly adapted Reader features allured users across all target age ranges, giving this existing Google program global sustainability for the future; a reflective increase was also noted in 2009 in the number of Google Mail (Gmail) accounts.
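Under the hood, an aggregator like Reader periodically fetches each subscribed URL and lists the new entries it finds. The sketch below shows the parsing half of that loop using Python's standard library; the sample feed and function name are made up for illustration and are not Google's implementation.

```python
import xml.etree.ElementTree as ET

# Illustrative sketch of what a feed aggregator does after fetching a
# subscribed URL: parse the RSS 2.0 XML and collect item titles for the
# timeline. The sample feed below is invented for this example.
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Social Media Blog</title>
    <item><title>Twitter clones worldwide</title><link>http://example.com/1</link></item>
    <item><title>Hashtags done right</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def feed_titles(rss_xml: str) -> list:
    """Return the titles of all items in an RSS 2.0 document."""
    root = ET.fromstring(rss_xml)
    return [item.findtext("title") for item in root.iter("item")]

print(feed_titles(SAMPLE_RSS))
# ['Twitter clones worldwide', 'Hashtags done right']
```

A real aggregator would also track which items a user has already read, which is what lets Reader present thousands of daily updates as a single sorted timeline.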
Crucial to Google's widely accepted achievements in 2009, the company once again made a critical impact on the Internet under the name Chrome, astonishing members, technological corporations, and industry journalists alike. Just days after Microsoft's release of Windows 7, Google countered with a restricted release of the Chrome OS operating system, with a full rollout expected for Mac and Linux users by 2010. Patterned after the Chrome browser, Google's target in this pinnacle release is minimalism, assuring speed, stability, and security to a global network of open-source beta testers. Predicted to climb the charts between mid and late 2010, Chrome OS is designed around three tiers, firmware, browser, and window manager, each expected to speed up the lightweight Google operating system. Chrome OS targets mobile users on the go and is intended for notebooks, tablets, netbooks, and other secondary computing devices. To that end, Google has partnered with nine technological corporations to design hardware, with giants on the project such as Hewlett-Packard (HP), Acer, Texas Instruments, and Intel expected to yield full product sustainability by maintaining reliability.
At 2009's end, Google reclaims its place at the top of the charts, spinning a massive profit for its well-sustained enterprise, which just a year prior drew its notable attention from the universal phrase "I'm feeling lucky." Closing the year focused largely on the next, Google puts its own spin on product innovation, adding Wave, Chrome, and Chrome OS to a copious list of reliable, user-friendly products, alongside older releases from Google Reader and Google Maps to YouTube, Blogger, and other globally accepted platforms.
With user-generated content on the rise across a globally diverse social media generation, research illustrates that audio podcasting is a vanishing technique, having peaked in 2004 with its association with the digital technology of Apple's iPod. In a rapidly changing, modernized technological age, videocasting has captured full attention, a method said to have killed the once innovative concept of the audio-based podcast. Nonetheless, audio podcasting continues to perk the ears of late bloomers; on such well-saturated ground, without additional social exposure, the curious too often sink in comparison with established, professional podcasts that enjoy sustained exposure, dignity, and large listener lists. While a great many new audio podcasters struggle against the upstream current, very few reach their destination; a large percentage quit the adventure, floating toward videocasting or a downstream sweep into text-based social media. Newly spawned podcasters should not quit their journey at the first setback, but should know when to let go of the audio-only dream and switch to the recommended videocasting, a successful combination of passive and active media broadcasting. Those who stay afloat should know that audio podcasting can still be pulled off with proper professionalism and the unique backing of social media. In this new age of informational swarms, both audio and video podcasters should be aware that a profit can be made from sharing user-generated content, simple enough if each cast is kept professional and the listener base is large.
The leading business magazine Forbes released structured tips for those willing to tread the saturated grounds of podcasting and forge ahead to a profit. First, take the easiest route: embed advertising within the podcast; this very profitable method mimics traditional radio and television programming and works best for the podcaster. Second, the professional podcaster should run two channels on the right podcasting medium: one free, to attract active listeners and wider market exposure, and one premium, similar to newer subscription radio, collecting a profit from those willing to invest in unique premium subjects. Third, Forbes urges its business audience to push podcasting enterprise models that offer special features to major customers; these features need to be appealing, such as simple security for premium channels, always striving to give listeners what they desire. Lastly, the most-used profit-spinning approach to podcasting success is including an assortment of advertising, easily done in various ways from product embeds to additional subject-based advertisements. These four tips are best applied on a personally hosted podcasting website, noting that sustainability is equal to profitability, which can be much larger than that of text-based freelance writing justified only by page views.
Accredited for publishing top-quality work from an extensive and diverse member base, Technorati achieved twenty-first-century nobility as one of the Internet's finest blog search engines, claiming South by Southwest awards for both Best Technical Achievement and Best in Show in 2006. Distributing an organized display of content based on popular subjects, Technorati's daily updates yield hundreds of member submissions; penned by the best, articles undergo strict editing procedures before publication, reducing inaccurate, error-prone gobbledygook so that the site maintains a name of pinnacle excellence. Its outstanding database, comprehensively measured in mid-2008 at 112.8 million blogs, fetches a wide variety of search results within seconds, organized in categories ranging from business, current events, and entertainment to lifestyle, technology, social media, and beyond; with new-age social media contributing rapidly, more than 300 million social networking topics exist as 2009 closes. Founded by Dave Sifry, an American software entrepreneur iconic within the social media revolution, Technorati maintains its founding objective of helping modern-day bloggers assemble, distribute, and emphasize their work, achieving socialization on a global scale. A portmanteau of technology and literati, the name appeals strongly to those who delight in the thirst for education and knowledge.
Focusing on user-generated material, Technorati looks to selective keywords that members assign to each piece of content, a metadata marvel of the twenty-first century. These metadata qualifiers describe the content's focus for search engine optimization through short words evocative of the highlighted blog; each tag is authorized by the author, allowing his or her content to be searched and retrieved on Technorati and other prevalent Internet search engines. Allocating exposure for each author and blog, Technorati has gained popularity for its expansive content ranking system, Technorati Authority.
Technorati Authority measures the number of blogs that have linked to a member's blog within a set cycle; blogs sprawling with inbound links gain a higher rank on a scale of 0 to 1,000, with 1,000 the highest authority. Blogs achieving a 1,000 rating are also ranked number one, conveying their influence within the universal blogosphere. Operating on a more sophisticated scale than in prior years, Technorati's new Authority excludes web-based blogrolls and factors in blog data and categorization in addition to linking behavior over a longer cycle; these changes were successfully deployed in mid-2008 to yield a more precise computation. For the Internet's data travelers, collections of leading blogs can be viewed on Technorati's Top 100 and Blog Directory, two highly recommended destinations for those who bask in new information in their pursuit of knowledge, education, and never-before-visited sites. Blog authors wishing to have their Technorati Authority recorded must first submit their blog to the Technorati site to appear within its substantial database; submitting one's blog and writing for the site are not the same thing, though most Technorati writers have their blogs listed, exposing themselves and their readers to the blog's popularity, rank, and influence. Authority ranking is often portrayed as a waiting game; blogs with top-ranked content contend for a superior rank compared with those that attract inferior linking. For example, a rank of 155 indicates more inbound linking than a rank of 0, yet 155 is still far from the exclusive 1,000. Users who wish to heighten their Technorati Authority should pay close attention to the quality of their content rather than mass-produced quantity, to self-exposure, and to tagged keyword choice, and should write to a targeted popular interest.
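At its core, the metric described above is a count of distinct linking blogs within a recent window. The sketch below illustrates only that idea; the data layout, the 180-day window, and the cap are assumptions for this example, not Technorati's actual algorithm, which also weighs blog data and categorization.

```python
from datetime import date, timedelta

# Illustrative Authority-style metric: count the distinct blogs that
# linked to a target blog within a recent cycle, capped at 1,000. The
# 180-day window and data layout are assumptions, not Technorati's
# real computation.
AUTHORITY_CAP = 1000

def authority(inbound_links, today, cycle_days=180):
    """Count distinct linking blogs seen within the cycle, capped at 1,000.

    inbound_links is a list of (blog_domain, link_date) pairs.
    """
    cutoff = today - timedelta(days=cycle_days)
    recent = {blog for blog, linked_on in inbound_links if linked_on >= cutoff}
    return min(len(recent), AUTHORITY_CAP)

links = [
    ("techblog.example", date(2009, 11, 1)),
    ("mediawatch.example", date(2009, 10, 15)),
    ("techblog.example", date(2009, 12, 2)),   # same blog counted once
    ("oldnews.example", date(2008, 1, 1)),     # outside the cycle
]
print(authority(links, today=date(2009, 12, 31)))  # 2
```

Counting distinct blogs rather than raw links is what makes the metric hard to game: a hundred links from one blogroll count once, which mirrors why the redesigned Authority excluded blogrolls.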
All in all, a portmanteau in itself, Technorati's elite site takes a popular interest in both writing and blog popularity, assisting the modern-day blogger with trend-setting technological exposure within its highly calculated system of operations.
Within the bounds of a highly successful, rapidly changing, and globally interconnected technological society, user-generated content is in its infancy, surpassing both expectations and critical predictions. Measuring 2008, one year before the Internet was inundated by social networking media, eMarketer recorded more than 82 million United States citizens participating in some type of user-generated content sharing, a number the research group predicted would climb to 115 million by 2013. Popularized by mainstream mass media outlets, user-generated content in 2009 has already surpassed those predictions, at numbers indicating an extreme spurt in innovative interest. Ahead of the curve, user-generated content has been vastly improved upon, leading to new growth. The content is nearly limitless in the forms it takes for a global society of web users: the copious majority is text, while the second most popular category is podcasting, two vastly different mediums, yet often overlapping channels of new-age socialization.
Modern-day podcasting can respectfully be defined as a method of sharing user-generated content by means of audio and/or video. Audio podcasting has been reported to be a dying medium, having peaked in 2004 alongside iPod technology. Only the most influential, innovative, and unique audio podcasts sustain their audiences, thanks to a properly established foundation, years of development, Internet exposure, loyal listeners, and full use of new-age social media; the authors of these podcasts have clearly built strong names for themselves and their work. Statistical evidence indisputably reports dwindling sustainability for newly created audio podcasts on such heavily saturated ground. With the rise of the social media market, audio podcasting has been displaced by twenty-first-century video-oriented podcasting, or "videocasting." This new-found phenomenon of video-based user-generated content, a slick move into the future, has been greatly propelled by affordable, high-quality camcorders and webcams, live streaming interfaces, social media, and personal networking, and popularized significantly by the broadcasting channel YouTube. For audio and video podcasters alike, maintaining a consistent blog or alternative socialization medium dramatically impacts long-term performance.
In today's changing global market, statistics show that sustaining one's audio podcast requires adaptation to keep one's name from becoming obsolete; most have switched to popular videocasting, while others keep their audio podcasts alive by supplementing them with a consistent blog, one that never detours from its subject of interest and targets a single area alone. Regardless of the chosen medium, the best podcasters and videocasters must first engage their target audience, remembering that viewers want factual, accurate content without confusion or annoyance; broadcasting inaccurate information, switching subjects, personal ranting, and dysfunctional podcasting methods produce the highest failure rates. A user's audio podcast, videocast, and supplemental blog must remain top quality, engaging rather than confusing or disrespectful, as a confused or disrespected viewer is unlikely to return and may go so far as to leave negative, opinion-driven feedback. As social media grows, even podcasters want to participate on a personal level; it is therefore highly recommended to separate the personal from the business to achieve maximum sustainability.
Whether as creatures of habit or out of a flared interest in something new, some are dead set on attempting user-generated content through audio podcasting. For those, the following "Ten Steps for Success" should be followed for the content to evolve, remembering the enhancement an additional blog provides.
[1.] Purchase a Real Microphone – From the start of user-generated podcasting, the microphone has been the primary requirement for proper sustainability. To engage listeners, a podcast must be free of background noise, which is a severe distraction to the listener. For the successful audio podcaster, this means purchasing a large-diaphragm cardioid condenser microphone; these studio microphones are built for both directionality and vocal clarity, unlike omnidirectional microphones, which add to the background noise by picking up sound from every direction. Choosing a cardioid condenser microphone means shying away from built-in devices, headsets, and the convenient $5 purchase from Wal-Mart. An expensive purchase to some, typically starting at $100, these large-diaphragm microphones pay for themselves in the end.
[2.] Position Accurately – To produce the desired result you need to be close to the microphone, yet being too close creates problems of its own. For accurate mouth-to-microphone positioning, you should typically be about one hand's width from the device: judge the distance by placing your thumb to your lips and extending your pinky finger to softly touch the microphone. Proximity affects sound greatly. If you are too far away, your voice will sound thin, forcing the listener to crank up the volume, which can drive some away without a second visit. At the opposite end of the spectrum, "the voice of God" is the undesirable result of being too close to the microphone; listeners shocked by this over-amplification will again reach for the volume control, and this is the quickest route to failure. Ideally, the microphone should sit on an easily adjustable stand at a 45-degree angle. A mounted microphone eliminates the bumping sounds conveyed to listeners when you hold the microphone in your hands, so it is never recommended, in any form of podcasting, to physically hold the device.
[3.] Preparation Prior – A key aspect of audio podcasting is to be prepared without scripting your content. Scripted content conveys a stiff voice; listeners can easily tell that you are reading the text in front of you word for word. Aim instead for proper interpersonal communication, speaking naturally. Before each podcast, listen to yourself a few times through a pair of headphones to shake off the stress; rest assured that the Earth will not stop revolving if it takes a few tries to achieve a picture-perfect voice. Never podcast, edit, and publish in a rush.
[4.] Reduce the Noise – You are accustomed to the noise level inside your home; respect that the listener is not. The human brain is wired to filter noise through selective hearing; unlike the remarkable human brain, microphones have no filtering system and pick up every minuscule decibel. You must understand the two types of noise that have to be addressed first: environmental noise and signal noise. Environmental noise comes from your surroundings, such as the air conditioner, fans, children, neighbors, traffic, and even fluorescent lighting. Power off as much as possible; if background noise remains a problem, try moving to another location, such as a closet full of clothes, where the clothes act as a sound barrier against whatever environmental noise still reaches the listener. Recording at night may help with traffic noise. Signal noise arises between the microphone and the recording device itself, so use the shortest cables possible; quality microphones use XLR cables, which require phantom power and specialized equipment. If using two microphones in different setups, remember that one must be powered off to avoid unwanted echo and dreadful ear-piercing feedback.
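For the technically inclined podcaster, the idea behind cleaning up residual noise in editing can be illustrated with a simple software noise gate. This is only a minimal sketch of the concept, not any particular editor's feature; the sample values and the threshold below are hypothetical.

```python
# Minimal noise-gate sketch: mute samples whose amplitude falls below
# a chosen threshold, leaving louder (voice) samples untouched.
def noise_gate(samples, threshold=0.02):
    """Return a copy of `samples` with quiet background noise zeroed out."""
    return [s if abs(s) >= threshold else 0.0 for s in samples]

# Quiet hiss (|s| < 0.02) is silenced; the voice signal passes through.
recording = [0.005, -0.01, 0.4, -0.35, 0.008, 0.5]
print(noise_gate(recording))  # [0.0, 0.0, 0.4, -0.35, 0.0, 0.5]
```

Real audio editors apply the same idea with smoothing (attack and release times) so the gate does not clip the starts and ends of words; even so, reducing noise at the source, as described above, always beats fixing it afterwards.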
[5.] Format – The structure of your audio podcast is referred to as its format; a format is made up of form, topic, and duration. Repetition keeps listeners coming back for more, so a podcast must repeat its form, its topic, and, most importantly, its duration. Loyal listeners never want to tune into a three-minute podcast one week and a twenty-two-minute one the next, nor a podcasting site jam-packed with content on wildly varying subjects. Keep in mind that inserting silly and humorous commentary may keep listeners listening even when the subject matter is disliked; judge the active listeners per podcast over a week to determine the effect. This effect will be different for every podcaster: what works for your friend may not work for you, and vice versa.
[6.] Communicate with Friends and Listeners – The best and most successful podcasts are those built on two-way dialog, where real interaction occurs. Setting up this kind of podcast may be difficult for the novice and even the intermediate audio podcaster: sound files must be attached, emails read aloud, and taped feedback inserted, with a great deal of skillful editing in between. Always show your utmost respect for each listener. Audio podcasting is consumed in real time, and a listener must invest a great deal of real-life time to become a loyalist, deserving of your respect for the time taken to hear your voice and content.
[7.] Prepare a Schedule – Podcasting on a schedule that works best for you is highly recommended. For example, if you are a podcaster with children, record when they are asleep or out of the home; if you know a train passes through the area at 2 p.m., record around it; if the upstairs neighbors are fighting, wait until after they make up (or shut up) and resume afterwards. On each podcast, let listeners know the subject of the next one; this keeps the audience waiting and wanting, provided you have a professionally built audio podcast that listeners want to return to. Whether your podcast is a subscription medium or a social media extra, you will always want to schedule it to produce the desired result.
[8.] Relax Yourself Before You Begin – The most common mistake in the art of audio podcasting is talking far too fast; slow down and relax. Again, the Earth will not stop revolving if the podcast takes a few tries; a successful podcast may require upwards of six or ten takes before it is suitable to publish, so never be overconfident. Allow yourself enough time to bring the podcast to the level you want it to be. Unreasonable goals set one up for failure in life, and the same holds true in podcasting. For those new to podcasting, it is recommended never to publish your first dozen episodes; use them to get a feel for picture-perfect results. No listener wants to hear that you are new and "learning", nor to sit through a podcast of voice failures and scripted mistakes. As a new audio podcaster, listen to your own voice many times over; this will calmly soothe the nervous, cracked voice and let it broadcast in its natural form. It is highly recommended that you listen to your voice through headphones.
[9.] Incorporate Other Voices – Bring someone into your podcast as a secondary voice, mixing up each episode to yield the desired listener response. A listener never wishes to hear the same monotone voice over and over; having friends help will prove a successful factor in audio podcasting. Separate the microphones, so as not to produce ear-piercing feedback, and have at it: a conversation is far more enjoyable to hear than a one-person monologue. Listeners will keep returning if a group of friends visits from time to time as voice extras; think of this as a "side dish" of improvement, each guest inserting a bit of humor for an atypical yet successful result. A sidekick also buys you time to think of the next important statement you want to make, clear your throat, take a drink, grab a snack, check mail from your listeners in real time, or step away to the bathroom.
[10.] Detail Where Possible – Always provide as much accurate and factual detail in every audio podcast as possible, out of respect for loyal listeners; detail is what sustains them. Audio is an active medium, unlike passive television broadcasting, where sound is linked with image; you must tell the story in full so that listeners can picture the image in their minds. Respect them by not making them work to create your image for you. Always draw the artistic picture for the listener, to reduce the confusion that makes them shut off. A podcast without detail is a waste of time, as no listener will be kind enough to sit through a confusing, detail-free episode. Always remember to have fun and enjoy yourself while doing so; that enjoyment will broadcast your natural ability blissfully to the listener's ears.
(Copyright © Social-Media-News 2009)
As the year quickly counts down to 2010, Twitter continues to roll out new features, each assisting the micro-blogging age. The explosive new "geotagging" feature arrived on Twitter users' doorsteps on November 17, 2009, the same day Google presented its new operating system, Chrome OS, which drew attention away from the release. Boosting locally based socialization, this brand-new rollout allows users to embed their location within the 140-character update sent globally. Users who select the feature will also send their geographical information; viewers of these personalized updates will notice a new pin icon embedded at the bottom, and simply clicking the icon opens Google's mapping technology to reveal the whereabouts. Expected to stir its own user controversy, this rollout comes on the heels of the beta test "project retweet" and follows the globally enjoyed "listing feature".
Geotagging is strictly optional, so that Twitter can ensure user protection and privacy. To enable these Google-based geographical tags, a user must first turn on the geotagging feature in the account settings and save the changes. It is highly recommended that each user contemplate their Internet safety before enabling such a feature; once enabled, millions of users globally will have a map to the doorstep. It is not recommended for those wanting to maintain their social media privacy, for potential targets of the common criminal, or for users under the age of 18. For those who feel this new attribute is safe for them, Twitter provides the ability to select which tweets bear the mapping; this selective use is controlled by the user, as is control over location history, which is displayed until the user deletes it. As the feature advances, Twitter assures that the data will be stored in its database only for a certain length of time before it expires. Users must also note that even after deleting the data, it may continue to be displayed by third-party applications for everyone to view; when using this feature, extreme caution and well-thought-out discretion are advised.
Released only for mobile third-party clients that also serve as applications, geotagging allows users on the go to tell their network where they are at any given point in time, an amazing feature for business marketing when users enable it while shopping at a retail location. Mobile users on the same network and in the same city may also find themselves in the same location, giving them the chance to meet and greet each other, while tweeting about the meeting, of course. Once again, mobile Twitter users must keep their Internet safety in mind, as stalking can become obsessive and life-threatening. Those who enable this new feature should use it at their discretion, turning it off as soon as anything malicious happens. One must also keep in mind that even if location data is removed, it may still exist in various third-party applications, opening the door for millions to see the locations.
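To make the opt-in mechanics concrete, here is a minimal sketch of how a third-party client might attach coordinates to a status update only when the user has enabled geotagging. It assumes Twitter's documented `lat` and `long` parameters on the status-update endpoint; the function name, the sample status, and the coordinates are all hypothetical, and no network request is made.

```python
# Sketch: build the parameters for a status update, attaching location
# data only when the user has opted in AND coordinates are available.
def build_geotagged_update(status, lat=None, long=None, enabled=False):
    """Return request parameters for a (hypothetical) status update."""
    params = {"status": status}
    if enabled and lat is not None and long is not None:
        params["lat"] = lat
        params["long"] = long
    return params

# With geotagging disabled, coordinates never leave the device.
print(build_geotagged_update("Shopping downtown!", 37.78, -122.40, enabled=False))
# {'status': 'Shopping downtown!'}
```

The design mirrors the article's privacy point: the safe default is off, and location is an explicit per-update addition rather than something silently included.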
Fidgeting nervously on the morning of November 17, 2009, three men took to the stage of a small California auditorium to unveil a piece of twenty-first-century innovation: co-founder Sergey Brin, engineering director Matt Papakipos, and vice president of product development Sundar Pichai. The company was Google; the release, a brand-new operating system, "Chrome". Riding the remarkable surge of Google's Chrome browser, whose 40 million globally diverse users paved the way, Chrome's full code was opened and promptly released, marketed under an already familiar household name. Its initial release is restricted to those who use Microsoft Windows; with a sustainable future assured, versions for both Linux and Mac will roll out before January 1, 2010, with Mac first in the queue. In addition to this pioneering feat, Google also announced that the Chrome browser will be powerfully enhanced with browser extensions, similar to Mozilla Firefox and Internet Explorer.
In terms of product development, the principal motivation behind Google's operating system is speed, also the key concept behind the lightweight browser that bears the same name, Chrome. With both simplicity and speed in mind, Google's goal is to revolutionize the traditional computing experience by making it as immediate as a user's television viewing. Fully tackling modern ease of access, Chrome OS will feature speeds atypical of its operating-system peers, achieved by removing the bulk that weighs systems down; running atop Chrome OS, the Chrome browser's performance will increase dramatically, yielding a quicker start to Internet navigation and shattering and shaming the competition.
Google's newest technological advance was carefully built to respect new-age life on the go, targeting notebooks and mobile tablets; convergence on these modern gadgets is expected to net Google millions, further funding the already existing multi-billion-dollar enterprise. Separate from Android, Google Chrome OS will also be open source; products without licensing fees and bulk may make possible a cheaper and greener computer, and certainly a quicker one. Chrome OS is expected to deliver an "improved" computer into the laps of users who want quicker performance across the whole machine. It will globally assist users who want their customized one- or two-year-old machines to run as well as they did on the first day out of the box, while demolishing the patience a user spends waiting for boot-up and providing the quickest route to e-mail, user-generated content creation, viewing digital data, playing games, and, most importantly, surfing the Internet. So transformed, Google Chrome will win hearts by offering an operating system free from the constant updating that plagues Microsoft's Windows products, while clinging to the souls of the computer-savvy generation with a worry-free program that ensures data files stay safe and preserved for the long term.