- And right now, right here, YouTube and Google are complicit in that system. The architecture they have built to extract maximum revenue from online video is being hacked by persons unknown to abuse children, perhaps not even deliberately, but at massive scale. I believe they have an absolute responsibility to deal with this, just as they have a responsibility to deal with the radicalisation of (mostly) young (mostly) men via extremist videos of any political persuasion. They have so far shown absolutely no inclination to do so, which is in itself despicable. However, a huge part of my troubled response to this issue is that I have no idea how they can respond without shutting down the service itself, and most systems that resemble it. We have built a world which operates at scale, where human oversight is simply impossible, and no manner of inhuman oversight will counter most of the examples I've used in this essay. The asides I've kept in parentheses throughout, if expanded upon, would allow one with minimal effort to rewrite everything I've said to be not about child abuse, but about white nationalism, about violent religious ideologies, about fake news, about climate denialism, about 9/11 conspiracies.
I wrote an application in HyperCard many years ago that took text files of word parts - prefixes, suffixes, roots, etc. - and combined them to make "words." Then it ran the results through a couple of filters to eliminate words with common oddities that English doesn't like - three consecutive consonants, Q without a following U, X without a vowel on both sides, etc. - and output the final list in a semi-randomized order. It created both fascinating and terrible words. The combinatorics were interesting to watch, and the edge cases were where things got really weird and unexpected results appeared. That became my favorite part of the program! That was in, like, 1990, or somewhere around there. It was a blunt instrument that would occasionally spit out excellent results. (If a client liked one of the words, they could buy it for $60k, or something. Not from me, sadly... from the company that paid me $500 to write the software for them.) We like to think our computers are so smart, and our tools are so amazing. But honestly, all we have are seriously blunt hammers. We just have millions and millions of them pounding away, like the proverbial monkeys on their typewriters. All these CGI studios need is for ONE of their videos to "hit", and it pays for the army of monkeys. Generate 10k videos and upload them programmatically, and the numbers pencil out. But, as KB says about Google's algorithm changes, the smallest tweak can completely disable an entire industry. It'll happen. And the content farms will iterate the next generation of their content engines. Over and over. This is not content for humans. It is content for computers.
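The scheme described above - combine word parts, then filter out letter patterns English doesn't like - can be sketched in a few lines of Python. The part lists here are made up for illustration; the original was a HyperCard stack reading text files:

```python
import random

# Hypothetical word-part lists standing in for the original text files.
PREFIXES = ["ver", "ax", "lum", "zan", "qu"]
ROOTS = ["tex", "ora", "mir", "onq", "ixel"]
SUFFIXES = ["ia", "on", "ix", "ance", "o"]

VOWELS = set("aeiou")

def plausible(word):
    """Reject words with the common oddities English doesn't like."""
    # No run of three or more consecutive consonants.
    run = 0
    for ch in word:
        run = 0 if ch in VOWELS else run + 1
        if run >= 3:
            return False
    for i, ch in enumerate(word):
        # Every 'q' must be followed by a 'u'.
        if ch == "q" and (i + 1 >= len(word) or word[i + 1] != "u"):
            return False
        # Every 'x' needs a vowel on both sides.
        if ch == "x":
            if i == 0 or i == len(word) - 1:
                return False
            if word[i - 1] not in VOWELS or word[i + 1] not in VOWELS:
                return False
    return True

def generate(seed=None):
    """Combine every prefix/root/suffix, filter, and semi-randomize the order."""
    rng = random.Random(seed)
    words = {p + r + s for p in PREFIXES for r in ROOTS for s in SUFFIXES}
    keep = [w for w in words if plausible(w)]
    rng.shuffle(keep)
    return keep
```

The interesting behavior lives in `plausible`: the combinator itself is a dumb cross-product, and the filters are what turn noise into occasionally excellent output - the same blunt-hammer-plus-filter shape as the content farms under discussion.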
I just got a chill. Considered abstractly, these are the dreams of the internet, right? Images and sounds and ideas created for its own amusement at the conscious request of no human entity. With some fudged definitions, that just might sound like a half-convincing sci-fi plot.

> This is not content for humans. It is content for computers.
It assigns intent where there isn't any, which is true of a lot of fiction as well - apophenia is one of the driving mechanisms of our conscious and subconscious thought processes, so no wonder we see patterns where there are none. Technically speaking, "dreams" are the wargaming of your subconscious. As the acetylcholine rinses away the neural connections that aren't strong enough to survive, it triggers an unconscious replay of all those that are, and reinforces the memories and experiences that have value to us as organisms - positive, negative, fight, flight, love, hate, good, bad. A dream is your unconscious psyche having a fire drill over something it thinks you'll have to deal with so that when it happens, you'll be ready. A bunch of different connections firing, some washing away in oblivion, some strengthening through repeated views... I can see the similarities. What's different is that there's agency to the dream. There's a greater focus. The whole reason we're having this discussion is that Alphabet refuses to be that greater focus. They don't want to be the curator. As such, there's a neurobiological similarity, but it is not the dreams of a conscious thing. It is the twitches of neurons without a greater brain.
This statement makes the intent of some dreams clear, and makes others' entirely less clear.

> A dream is your unconscious psyche having a fire drill over something it thinks you'll have to deal with so that when it happens, you'll be ready.
That's the part that freaks me out, though. The fundamental architecture is there. It doesn't have a purpose, but I keep feeling the word "YET" clinging to the end of that statement. I wonder what the electrical signals of the first proto-nerve cells looked like in early multi-cellular organisms.

> It is the twitches of neurons without a greater brain.
Is it, though? I feel like calling Alphabet the main or the sole perpetrator would be like calling Al Qaeda the only terrorist group or the US -- the only democracy. I'm not denying that Alphabet is using the rules already established, but I think the problem is not with people using the system, but with the system itself being what it is. YouTube -- for this particular example -- uses its vast analytical capacity to rabbit-hole people into a journey they didn't "know" they wanted to go on. It's its usage of the primal instincts ("I want more of what I already like") to drive its business model that makes the whole idea malignant. Can you blame them for wanting to do what they believe to be a good service? Unlikely. Can you blame them for mangling the definition of what a good service is? I'd say you must. Abusing others' instincts for your profit is so animalistic in itself. It is as if a predator has discovered the idea of traps and is now using it to get rid of its social-hierarchy competition by luring them into a place of no return. The capitalist business model is driven by greed, another primal instinct ("I want power"), on a scale one could never even begin to grasp -- such are the limits of our human empathy. If you could only imagine, for a moment, the grief and the misfortune you're causing to others through your services, I'm sure you'd think really hard about any possible reason for staying in the loop any longer. But the animal part of us is not going away any time soon. Delving into the horrible parts of the human condition makes one sure of that -- a feeling we insulate ourselves from in daily life, because the misery would drive us crazy. It's not bad: it's just how things are with us, animals as we are. But we -- the collective humanity -- have built a civilization by not doing the same thing: we've built it on doing better.
That insulation we wrap ourselves in also serves as a guide: it tells us that the primal is not normal, and that there are more advanced ways to do things -- ways coming from centuries of research, experience and fighting for what one thinks is right. My feeling is: the goal here is not to stand up to the bully and the abuser. The goal may be to stand above them, by keeping one's conduct a level better. What that might mean in practical terms is yet beyond me, but doing better by example seems to me like a good way to do things. It's like Marianne Williamson said: "[A]s we let our own light shine, we unconsciously give others permission to do the same". Doing better breeds doing better.
It is. We're talking about video platforms that shape content through algorithms rather than human taste. This is only necessary when the amount of content is so vast that there is no moderation. This describes one platform and one platform only: Youtube. What's described in this article is basically the malignancy of the algorithm. This is not a problem Facebook has, because their videos are not only targeted and siloed, they aren't aimed at children. This is not a problem Vimeo has, because their content is barely browsable. This is not a problem Netflix or Amazon have, because their content is licensed and curated. It's only Youtube. Google, for its part, revises its search algorithms regularly. They kill entire industries in the process. That something such as a "private blog network" can be profitable and lucrative (as opposed to paying for Google AdSense) says something about the primitive nature of their algorithms to begin with... but it also shows how keenly interested Google is in refining things when it affects their profits. Youtube is not profitable. It probably never will be. Alphabet's approach with Youtube is to make sure it's the only channel anyone ever thinks about uploading content to, and when they start paying you back they're cosa-nostra secretive about the terms. You basically need a million hits before they'll give you a dime... which means most of the videos shown never really pay out at all. What we're observing is an algorithmic shotgun blast in an attempt to wrest profit from an algorithm that says "if you see six seconds of it, even by accident, it counts as a view." Alphabet could fix this by not being so awful. Nobody else is as awful as they are. It's an Alphabet problem.

> Is it, though? I feel like calling Alphabet the main or the sole perpetrator would be like calling Al Qaeda the only terrorist group or the US -- the only democracy.
I can see your point. I was arguing from a much higher perspective; put the way you did, though, it isn't as applicable as the case-specific details are. How would they wrangle such a change, were they interested? More importantly, what reasons could nudge them into becoming interested in it?

> Alphabet could fix this by not being so awful.
Not allow any upload to be viewable by children until it's been reviewed by a human for basic content guidelines. It would be as simple as not allowing any upload that uses licensed characters not directly owned by the uploader, and not allowing any video with disturbing or age-inappropriate content. This would cut the uploads down by a factor of a hundred or a thousand. Okay, fine. Reimburse them for views at 10x or 100x the rate of general Youtube content. Congratulations. You're now incentivizing professional studios to create children's content and disincentivizing the off-shore CGI farms. This shit gets so easy once you ditch the libertarian "we're dissolving the boundaries between creators and consumers" mantra Silicon Valley clings to like a goddamn bible. Guess what, choads - I want a fuckin' filter. I want some curation. Know why I let my kid pick whatever she wants off PBS Kids? It's f'n curated. The fact of the matter is, when you refine your platform to make it reward abso-goddamn-lutely anything that gets uploaded, you'll get shit like Elsa burying spiderman alive.
That would be a massive undertaking in terms of human resources, though, wouldn't it? How many videos can one watch in an hour?
It totally would be. Check this out: The first three lines make money. The rest never do. Youtube sells ad content on it, but you get zero remuneration for a video with less than a million views. For a million views, you get a thousand bucks. Now watch this: Let's hire an intern. Let's pay her a decent wage - call it $20 an hour. Let's throw some overhead on her ass - call it 100%. Now let's assume that she can spend half her hour watching videos, and half her hour reporting on and categorizing the videos she's watched. My girl costs me $80 an hour to curate Youtube videos. Know what, though? I'm gonna shoot the moon. I'm going to give her a boss who manages nine of her friends, and I'm going to pay her boss $50 an hour. She's going to spot-check and silo the stuff her interns watch. She's got 100% overhead, too, and adds nothing to the process, but she's only 1/10th of the cost. My intern and her boss together can crank through 30 minutes of content an hour at a cost of about $90. Now, producer-of-children's-videos: I'm going to charge you $3 a minute for every video you upload to my service. Remember, I'm paying out better - rather than $1000 for a million views, I'm paying out $1000 for 100,000 views. Hey, just to not be a dick, how 'bout I reimburse you your review fees when you cross 10,000 views? That's a third of the way into the tail on my graph, deep in the nobody-cares-about-it corner of Youtube, but you're break-even, even on your shittiest product. So. Is your 3-minute video worth $9? Is your 20-minute video worth $60? If you're a human, fuck yeah it is. You spent all goddamn day on it. I've worked on professional channels (Smosh etc.) where the burn rate is closer to $3k a minute. You wanna charge me $12 to upload that thing? Take my money. But if you're a machine? Poof. There goes the tail. There goes your algorithmic weirdness. And fuckin' A, you just created jobs.
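The napkin math above, following the comment's own accounting (every figure here is that comment's assumption, not real data), pencils out like this:

```python
# All figures are the comment's own assumptions, not real data.
intern_wage = 20.0        # $/hour
overhead = 2.0            # 100% overhead doubles the loaded cost
watch_fraction = 0.5      # half the hour watching, half reporting

# Loaded cost per hour of reviewed content for one intern: $20 * 2 / 0.5 = $80.
intern_cost = intern_wage * overhead / watch_fraction

# One $50/hour boss (100% overhead too) split across ten interns adds ~$10.
boss_cost = 50.0 * overhead / 10

# Together they cover 30 content-minutes per clock hour at about $90.
team_cost_per_hour = intern_cost + boss_cost    # ~$90
fee_per_minute = team_cost_per_hour / 30        # the ~$3/minute review fee

three_minute_video = 3 * fee_per_minute         # ~$9 review fee
twenty_minute_video = 20 * fee_per_minute       # ~$60 review fee
```

The $3/minute upload fee falls straight out of the team cost divided by its throughput, which is why the $9 and $60 figures for 3- and 20-minute videos match.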
You're not claiming one video is better than any other, you're just vouchsafing them against weirdness and copyright infringement. You know, that shit you used to get from basic goddamn public access television before the Aynrandians took over entertainment.
That $20/hour is exactly why it's not going to happen. This website says there were 400 hours' worth of videos uploaded to YouTube every minute back in July '15. That's 24k hours an hour. That's almost a million USD an hour spent on interns alone. At least if you flush that money, it might end up popping up in someone's pocket.
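A quick sanity check on that figure, assuming the parent comment's $20 wage with 100% overhead and one-to-one watching (one intern-hour per content hour):

```python
# The July 2015 figure cited above: 400 hours of video uploaded per minute.
upload_hours_per_minute = 400
hours_arriving_per_hour = upload_hours_per_minute * 60   # 24,000 hours/hour

# Assumption: $20/hour wage plus 100% overhead, watching at 1:1 speed.
loaded_intern_rate = 20 * 2                              # $40/hour

# Cost to have humans watch everything as fast as it arrives.
cost_per_hour = hours_arriving_per_hour * loaded_intern_rate
```

Under those assumptions the bill comes to $960k per hour, which is the "almost a million USD an hour" in the comment above.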
Ahh, but grasshopper - we're not screening every Youtube video. We're screening every Youtube video that wants to show up on Youtube Kids. We're going for that razor-thin crust that actually makes money, and we're saying "we'll make you more money, but you have to pay an entry fee." Google is already great at this - modifying the AdSense platform a little (and then giving it human oversight) would do the trick. Zero point three three percent of Youtube videos ever make a million views. We'll ignore the colossal barrier to entry charging any fee raises and presume we're shooting for the 2.69% that actually matter. We'll also go really wide and assume 20% of Youtube content is going to end up on Youtube Kids. So now we're looking at 0.53% of your 400 hours per minute. We're at about 2 hours per minute of children's content for policing. My humans can do half a minute per minute. I need a team of 120 interns and 12 supervisors (technically I need three times that because I need three shifts, but I'm not paying them all at once). My interns cost me $80/hr and my supervisors cost me $100. My whole team is burning $10k a day - add all the infrastructure you want, my total Youtube team is under a half million dollars a year. Alphabet pulled in 90 billion dollars last year. And remember - we're not doing this for free. We're charging the uploaders. Actual cost to Alphabet is zero.
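The volume funnel in that argument can be checked quickly. This only verifies how much content would need policing, not the staffing; every percentage is the comment's own assumption:

```python
# Percentages are the comment's own assumptions, not measured data.
fraction_that_matters = 0.0269   # the "2.69% that actually matter"
kids_share = 0.20                # assume 20% of that lands on Youtube Kids
upload_hours_per_minute = 400    # the July 2015 upload-rate figure

# 2.69% * 20% is the ~0.53% of uploads that would ever need screening.
screened_fraction = fraction_that_matters * kids_share

# Which works out to roughly 2 hours of children's content per minute.
kids_hours_per_minute = screened_fraction * upload_hours_per_minute
```

Multiplying the two filters gives 0.538%, the "0.53%" above, and applying it to the 400 hours/minute firehose leaves about 2.15 hours of content arriving per minute to police.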
How soon are you getting your part of the revenue from the idea?
They'll never do it. Alphabet/Google is all about the democratization of data and becoming taste-makers is anathema to them. Nobody else can do it because they don't have the network. Friend of mine is a DMCA mercenary. He works for a company that media organizations pay to scour Youtube for copyright-infringing content. If you pay them to find your content, they look for your content and serve up takedown notices. They're one of many companies that do this. Note that Youtube could do this and make the companies much happier - this is an industry that exists solely to make Youtube comply with the law. But Youtube won't because then they have to be responsible for their content. Thus we come full circle: the problem on the Internet is Alphabet because they fundamentally believe they should not be responsible for the content they display.
I see your point. Devil's advocate says that they did start as a search engine, ultimately irresponsible for the things they display: they merely process your query and put out what they've found. The question here is: to what degree are they actually responsible? If you create a platform for people to share things, how much filtering do you do as the owner of that platform? I think the trade here is user engagement: if you actively fight those who post copyright-infringing material on your services (say, YouTube, though Google the search engine is a big contestant here as well), you're alienating viewers, and losing viewers means you lose a part of your audience. They don't want to be seen as an infringement-happy platform, so they'd take it down because they have to; otherwise, keep those views coming, and maybe you'll stay for that other appealing piece of video material we have for you... Alphabet is the face of what many cyberpunk writers feared back in the day. You can't just sue them for copyright infringement, even if you can make a solid case that they allow it on their services. You can't fight them at their game, and you can't make your own game, either. They'll play carefully to take whatever they want, and they're too big to fail at this stage. It's exciting as a real-world piece of worldbuilding -- but also terrifying.
Google started out as the un-Yahoo - Jerry Yang & Co hand-curated links and made their curation available to browsers everywhere. Google measured in-bound links and weighted results that way, assuming that the sites that more humans linked to were the sites that more humans wanted to view and used an algorithm to find that connection. It worked really goddamn well. But as the internet has grown more sophisticated, Google has presumed that the sophistication can be best tracked with a more sophisticated algorithm. There is no aspect of Google's culture that presumes a conscious choice will give better results than a well-refined algorithm. The dumb thing is all they have to be is compliant with broadcast standards and practices. They refuse to do this because then they become responsible. As such, third parties have to police Youtube for violations of standards and practices. At the same time, they profit no matter what happens. There are no third parties policing Youtube for creepiness. And here we are.
I've already expressed a part of my views on the matter -- the part concerning conduct, and Alphabet's conduct in particular -- in a response to kleinbl00's comment. Now, I'd like to focus on parenting, which is another part of the issue. I'm not a parent, myself. I have no children, and have barely had experience with others'. But I think a lot about it, because at some point, I want to have a family of my own. I think about the things I want to teach my child -- or, to let my children learn, as it were -- and "just let them watch YouTube videos" is one of the more obvious points that comes up often as I look around at what other parents do. It is unsettling how many parents are willing to let their children immerse themselves in escapism and instant gratification from such a teachable young age. The children learn -- but what do they learn? That their purpose is to be quiet and not interrupt their parents' unfulfilled lives, wasted away on constant chatter and yelling at the skies for not giving them what they want? That if they're sad, they can always go on the Internet and find something that will soothe the pain, despite not nearly healing it? A child can learn incredibly quickly -- that much is not in dispute. They're the ones who are going to inherit our technological advancement from us while still young enough -- and they will run with it, way quicker than us, we ourselves being reduced to the old people we so love to snicker at. I'm okay with that. I will fight for my children's right and ability to interact and integrate with technology, even if it may reduce my part in their lives. It's satisfying to be the source of information for your child -- they treat you as a bag of holding full of useful knowledge -- but that era is already past.
I'll still be there for them as a parent, and I will try my best to answer their questions, but if they can learn better from someone else -- either someone with more experience or simply with better charisma -- then, other things being equal, they should. That being said... Technology, like any part of our lives, is never black and white. It's never about whether it brings just the good or just the bad. There are positives, but there are also caveats to look out for. Parents are not machines. Like the author said in his article (paraphrased), "if it brings parents some relief, then OK". The contemporary stressfulness of living, exaggerated by many sources as it may be, still takes an inevitable toll on a person. We all need rest from time to time, and taking care of a child at a very young age can be a test of one's limits indeed. I would rather live on Red Bull for two years than let my child see me as a mere gift-giver and homework assistant. It's my responsibility to bring my child into the world to the best of my knowledge and effort, and I'll be cast to the deepest pits of Hell before I let myself off this bumpy ride. To give them a smartphone streaming cartoons is an insult to what this little person can become. So many young men and women become great later in their lives simply because of what their parents gave them in the first few years of their lives: the discipline, the curiosity and the tenacity to overcome the obstacles they'll inevitably encounter. This, however, raises a question for me personally and, I think, for many new parents around the world. We grew up with a different set of entertainment issues, and none was as subtly powerful as the modern Internet's ability to capture one's attention. What do we do with this particular problem? How do we solve it? Simply not let children near the Internet for the first eighteen years? That's absurd. Helicopter over your child, checking everything for malicious content?
"You're not letting your child live, lady; get out of the class!" What, then? Parental controls, only letting your kid go to trusted sources? And who would establish which sources to trust? You? You're a cretin when it comes to the vast information field of the Internet. Even the most educated psychologists are stumbling: how can you do better? Do you even have to do anything? My instincts say I do: not because I must control everything that's happening in my kid's life, but because I'm more equipped to deal with the dangers of the modern world. I've lived some, I've seen some things. The degree to which I should do so, however, as well as the venues which require careful observation, are beyond my current understanding -- and that's scary. Nobody wants to mess up their child's life -- even though we all inevitably will, whatever we do.
There's a lot of "other people are the problem" in this article that I really didn't care for. Couched in the "I don't have kids, but" is a seething judgement of a straw man that exists somewhere and allows the author to clutch his pearls about the poor choices made by others that exist in the argument largely to be disdained. I am a parent. I'm surrounded by parents. I've got nine goddamn birthday gifts to buy between now and January and I've seen dozens or hundreds of children I know interact with technology. Not just in a "I saw a kid in a stroller watching videos" kind of way, but in a "I was one of those horrible people drinking wine and talking politics while my friends' kid sat there watching The Human Centipede in his crib" kind of way. I've never met a single person who lets their young kids watch Youtube. I've never met a single kid that would give a fuck about Youtube. Check out the Netflix kids interface. Check out the PBS kids interface. Check out the Amazon kids interface. They give you curated collections of videos and clips arranged around the faces of branded content your kid recognizes. They can figure it out by 3. More than that, kids aren't the slightest bit interested in fresh content. They'll watch the same Super Why episode nine times in a row. When you're talking about fresh young minds seeking out things to watch, they've got a really limited search ability and an impressively unlimited ability to navigate an interface to find exactly what they want. The content under discussion here is basically SEO blogspam in video form. It largely exists to find people who browse for it accidentally, and to find automatic playlists created by algorithm. If a kid happens to find it they'll look at it for a few seconds and skip it (or ask for someone to skip it). 
Most importantly, if it doesn't have the voices they're expecting, they lose interest immediately - not discussed in the article is the fact that children's videos are inanely talkative and that the kids want the voices and songs. That's really the worst part of all this - it's a failed experiment. These videos aren't even targeting kids, they're targeting algorithms that may at some point target kids but don't right now, because real kids are busy watching the same Daniel Tiger clip over and over again. And Daniel Tiger don't carry a scythe.

> I'm not a parent, myself. I have no children, and have barely had experience with others'. But I think a lot about it, because at some point, I want to have a family of my own.
If I may summarize your view on the matter: "The author's talking nonsense about children's interests, and you shouldn't worry about it." Was that correct? EDIT: Also, thank you for sharing your experience with actual, living children, as opposed to an image fitting the narrative.
No. My view on the matter is that Youtube Kids is ostensibly a silo for children to watch videos, but as evidenced by this article, it's largely a silo for algorithms to compete for views based on inputs that have nothing to do with children. My view on the article is that the author assumes that if these videos exist, there must be children watching them, and my own experience suggests that's not an easy leap of faith to make. This is literally a video expression of this: There are no children capable of finding this video who are not old enough to go "whoa. Pregnant hulk roundhouse kicking pregnant Elsa. WTF?" The formative age where you wouldn't want your kid to see this isn't looking for it. My kid's favorite video at 2: My kid's favorite video at 3: She's not weird. Sit through a Teletubbies episode; the young kids mostly want calmness and introspection. This is why little kids love Mr. Rogers. They gotta start being into fart jokes before Sponge Bob becomes interesting.

> Superheros Pregnant Soccer Balls Fidget Spinner Spiderman Joker Hulk Cartoon Funny Kids Video Pranks
I'm afraid my kid might like some of the weirder shit if she could find it. She's an odd duck. Mostly she likes to watch videos of kids and animals dancing, and videos made by a pretty slick kid-friendly magician. I only casually supervise her YouTube experience. She's probably seen some inappropriate stuff, but not terribly inappropriate, and kids need to be exposed to some left-field shit so that they don't crumple up like tissue the first time they get hit by a stiff breeze. The absolute worst thing she's seen came when I was trying to find a stream of the despicable movie Jumanji for her. It wasn't being streamed on any of the crap we are subscribed to, so I started looking at putlockers. A decidedly XXX-rated ad came up on one of them. Weeks later she said, "Dad, I've been thinking about what that man was doing with his penis". Baaaagghh buuurrrggeee gaaahhh, fuck my life.
My process goes something like this: Is it on Netflix? No? Then: is it on Amazon? No? Then: is it on PTP? Of course it's on PTP. Can we wait? No? Then buy it on Amazon. Jumanji is dope. Way too old for my kid at the moment, though. We watched Princess Bride over the weekend and that was almost too much.
My kid could watch Robo Cop and not flinch. She is a hard little monster. She really, really, really wants to watch scarier horror movies. She loves Goosebumps, Beetlejuice and Coraline, and knows that there are way scarier things that she would love to consume. For the life of me I can't think of many 'safe' horror movies for her at age 6. We watched Army of Darkness, which she declared was the best movie ever, and Night of the Living Dead. Wish I could think of others that aren't too horrible or sexual. Maybe the Dark Crystal; it gave me nightmares as a kid, but I think she would eat it up. Personally, neither me nor the wife are horror fans. As an aside, I think my kid might watch an hour or two of YouTube a week on average. It's not a big passion of hers.
{Grain of salt - I don't have any kids either, nor much interaction with kids}. My initial thought is that it's a much larger question than just parental controls. You bring up the topic of entertainment issues, and it's true that they exist. How we seek out entertainment, what we seek out, and the impact it has on our and our children's lives is an interesting question. Step one could be controlling what technology is available to children, and at what times, allowing curiosity and entertainment to be derived from our physical world, with technology as a supplement: PBS, some cartoons, etc., and figuring out how to mold the entertainment controls around that subset of content.

> What, then? Parental controls, only letting your kid to the trusted sources? And who would establish which sources to trust? You? You're a cretin when it comes to the vast information field of the Internet. Even the most educated psychologists are stumbling: how can you do better?
I'm disturbed, but by this post much more than anything depicted in it. Whether he's aware of it or not, he is arguing for censorship, political and otherwise. I'm a parent, and it's my responsibility, not Google's or a content creator's, to filter what my child sees. If we start asking Google to regulate what we see, or what our kids see, we may wake up one day and not like the results.
Sure, that's misleading. But that means we shouldn't trust it, and should act accordingly. Moreover, it's ridiculous IMO to say "bots have run amok and we can't control them, so we should get more bots." It's the 21st-century equivalent of bringing some new species to an area to control an old one, and we've seen how well that works.
I read this article the other day. I've seen these videos and the entire system is just messed up. I think YouTube needs better restrictions on their "Kids" content. Personally, I think I would curate the content my kids watch, but when there is an autoplay function, it can really lead down a dark hole.
Well, it's possible they were entirely automated, I've seen some papers taking a script and generating an animation from it. I don't think anything like that is available off the shelf though and it would be way more trouble/money to write a program to do it than to hire a few Full Sail animation graduates to make half-hour animations that fit some word salad as fast as they can.
Xtranormal basically animated scripts algorithmically. I'll bet you could write an algorithm that would take certain inputs and create certain animations. I'll bet it would be a lot easier to staff a sweatshop full of entry-level computer operators who don't speak a word of English.
I should confess I didn't read it through, and it's a bit over my English level, but let's take the sleeping bunnies video. Apart from the "copyright" concern, I didn't understand what is wrong with it. Can someone summarize the problem with that video in a sentence?