Truth and Honesty

Five missed calls from my grandmother. Five. Someone must have died. No one ever has anything that urgent to call me about unless something at my employer broke and I need to drive in to fix it.

I return her call and I’m asked if I did my taxes yet. Of course. I’ve bought a good many things I’ve needed. It’s been a rough year. We’ve even spoiled ourselves for going without for so long. We have deserved it, having struggled through. Unlike my snake of an ex who has her rent paid in full with child support. I’ve had to put blood, sweat, and tears into everything I’ve earned. No free rides. I pay my dues, even if it’s just shoe money for a section 8 recipient.

Tweet Scraper in Python

Code first, talk later.

#!/usr/bin/env python
# encoding: utf-8
 
import tweepy #https://github.com/tweepy/tweepy
import unicodecsv
import sys
 
#Twitter API credentials
consumer_key = ""
consumer_secret = ""
access_key = ""
access_secret = ""
 
 
def get_all_tweets(screen_name):
	#Twitter only allows access to a user's most recent 3240 tweets with this method
	
	#authorize twitter, initialize tweepy
	auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
	auth.set_access_token(access_key, access_secret)
	api = tweepy.API(auth)
	
	#initialize a list to hold all the tweepy Tweets
	alltweets = []	
	
	#make initial request for most recent tweets (200 is the maximum allowed count)
	new_tweets = api.user_timeline(screen_name = screen_name,count=200)
	
	#save most recent tweets
	alltweets.extend(new_tweets)
	
	#save the id of the oldest tweet less one (guard against an empty timeline)
	oldest = alltweets[-1].id - 1 if alltweets else None
	
	#keep grabbing tweets until there are no tweets left to grab
	while len(new_tweets) > 0:
		print("getting tweets before %s" % oldest)
		
		#all subsequent requests use the max_id param to prevent duplicates
		new_tweets = api.user_timeline(screen_name = screen_name,count=200,max_id=oldest)
		
		#save most recent tweets
		alltweets.extend(new_tweets)
		
		#update the id of the oldest tweet less one
		oldest = alltweets[-1].id - 1
		
		print("...%s tweets downloaded so far" % len(alltweets))
	
	#transform the tweepy tweets into a 2D array that will populate the csv	
	outtweets = [[tweet.id_str, tweet.created_at, tweet.text.encode('utf-8'), tweet.geo, tweet.source] for tweet in alltweets]
	
	#write the csv	
	with open('%s_tweets.csv' % screen_name, 'wb') as f:
		writer = unicodecsv.writer(f)
		writer.writerow(["id","created_at","text","geo","source"])
		writer.writerows(outtweets)
	
 
 
if __name__ == '__main__':
	#pass in the username of the account you want to download
	get_all_tweets(sys.argv[1])

The entirety of this script doesn’t belong to me; my only contribution is fixing the UTF-8 issues. It requires tweepy and unicodecsv, and it outputs the tweets to a comma-delimited text file.
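Once the scraper has run, the CSV can be pulled back in with the stdlib csv module for whatever analysis you like. A minimal sketch, with the same column layout the writer above produces; the file name, rows, and keyword are made up for illustration:

```python
import csv

# Simulate a small "<screen_name>_tweets.csv" as the scraper would write it
# (these rows are fabricated sample data)
sample = "example_tweets.csv"
with open(sample, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "created_at", "text", "geo", "source"])
    writer.writerow(["1", "2013-08-01 12:00:00", "hello flint", "", "web"])
    writer.writerow(["2", "2013-08-02 09:30:00", "blueberries again", "", "web"])

# Load the file back and filter tweets mentioning a keyword
with open(sample, newline="") as f:
    rows = list(csv.DictReader(f))
hits = [row["text"] for row in rows if "blueberries" in row["text"]]
print(hits)
```

Since the header row is written first, DictReader gives you the columns by name instead of by index, which keeps the filtering readable.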

Data Collection: SeriousMode

I’ve spent a long time trying to put together a twitter scraper. I haven’t had more than twenty minutes a day, at most, of free time to spend on it. I finally spent some time ripping apart a quick script written in python by someone else, using tweepy. It’s really damned handy. I’ve set it to grab the max allowable tweets from any given user, take the data I find important, then format it for storage in a comma-separated value flat file. I spent maybe four hours dicking around with the thing to get it working as seamlessly as possible. Once the “eureka” moment hit, it only took about twenty minutes to turn on data collection: an irritating, but practical, exercise in identity concealment.

Blueberries and Phil Shaltz

Twitter harbors some interesting people. Far more interesting than those on most social networks, such as Myspace and Facebook, perhaps in part because it’s not as popular as they are. Individuals such as C1TYofFL1NT and TheLastBand1t are quite attentive to some fascinating goings-on in Flint. They’re more attentive to local politics than any other folks I’ve met.

Specifically, these two gentlemen are keen on blueberries. What does anything important have to do with blueberries? The answer probably won’t surprise you, because your gut instinct is quite correct when you assume nothing at all. That’s right, Phil Shaltz bought a billboard and had it say “I’m concerned about the blueberries.”

Hacks by Hammond

I figured I’d mirror this, just in case. It’s already out there, so the powers that be can’t expect to be able to suppress it.

Sabu also supplied lists of targets that were vulnerable to "zero day
exploits" used to break into systems, including a powerful remote root
vulnerability effecting the popular Plesk software. At his request,
these websites were broken into, their emails and databases were
uploaded to Sabu's FBI server, and the password information and the
location of root backdoors were supplied. These intrusions took place
in January/February of 2012 and affected over 2000 domains, including
numerous foreign government websites in Brazil, Turkey, Syria, Puerto
Rico, Colombia, Nigeria, Iran, Slovenia, Greece, Pakistan, and others.
A few of the compromised websites that I recollect include the
official website of the Governor of Puerto Rico, the Internal Affairs
Division of the Military Police of Brazil, the Official Website of the
Crown Prince of Kuwait, the Tax Department of Turkey, the Iranian
Academic Center for Education and Cultural Research, the Polish
Embassy in the UK, and the Ministry of Electricity of Iraq.

Sabu also infiltrated a group of hackers that had access to hundreds
of Syrian systems including government institutions, banks, and ISPs.
He logged several relevant IRC channels persistently asking for live
access to mail systems and bank transfer details. The FBI took
advantage of hackers who wanted to help support the Syrian people
against the Assad regime, who instead unwittingly provided the U.S.
government access to Syrian systems, undoubtedly supplying useful
intelligence to the military and their buildup for war.

All of this happened under the control and supervision of the FBI and
can be easily confirmed by chat logs the government provided to us
pursuant to the government's discovery obligations in the case against
me. However, the full extent of the FBI's abuses remains hidden.
Because I pled guilty, I do not have access to many documents that
might have been provided to me in advance of trial, such as Sabu's
communications with the FBI. In addition, the majority of the
documents provided to me are under a "protective order" which
insulates this material from public scrutiny. As government
transparency is an issue at the heart of my case, I ask that this
evidence be made public. I believe the documents will show that the
government's actions go way beyond catching hackers and stopping
computer crimes.

Jeremy Hammond

Updates

Added / Updated general site links

  • Reverted back to using rsync because I’m lazy
  • Changed VPS providers because broke / lazy
  • Merged all services to single VPS because broke
  • Still no dedicated network access, working on it though
  • Will be returning to intarbutts shortly to harangue the masses
  • Going to return to blogging about whatever the fuck I want

I think that about covers it. Not changing the style or mechanism of this blog, though. The architecture works in case I want to be really mean and piss off script kids; they’ll have no recourse but to baww.

Step by step GR&R with Minitab

I’m going to go through an incredibly quick rundown of how to conduct your own Gage Repeatability and Reproducibility study using Minitab. The first thing you should keep in mind is that there are ways you “should” conduct a GR&R and there are ways you “shouldn’t”. You “should” be doing a blind, randomized study that doesn’t allow the “operator” to influence the study with what he expects. This isn’t always the easiest way to begin your first study or introduce yourself to these types of studies. For your first study, just do your trials in order.
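Minitab automates the arithmetic, but the crossed Gage R&R decomposition it performs is just a two-way ANOVA over parts and operators, with the variance components backed out from the mean squares. Here's a minimal sketch of that math in Python; the measurement data, part/operator counts, and names below are made up for illustration, not pulled from any real study:

```python
import math

# measurements[part][operator] = list of repeat readings (fabricated sample data)
measurements = {
    0: {"A": [10.1, 10.2], "B": [10.0, 10.1]},
    1: {"A": [12.0, 11.9], "B": [11.8, 12.0]},
    2: {"A": [ 9.5,  9.6], "B": [ 9.4,  9.5]},
}

parts = sorted(measurements)
operators = sorted(measurements[parts[0]])
p, o = len(parts), len(operators)
r = len(measurements[parts[0]][operators[0]])  # replicates per cell
n = p * o * r

grand = sum(x for pt in parts for op in operators
            for x in measurements[pt][op]) / n

# Level means for each factor and each part-operator cell
part_mean = {pt: sum(x for op in operators for x in measurements[pt][op]) / (o * r)
             for pt in parts}
oper_mean = {op: sum(x for pt in parts for x in measurements[pt][op]) / (p * r)
             for op in operators}
cell_mean = {(pt, op): sum(measurements[pt][op]) / r
             for pt in parts for op in operators}

# Sums of squares for the two-way ANOVA with interaction
ss_part = o * r * sum((part_mean[pt] - grand) ** 2 for pt in parts)
ss_oper = p * r * sum((oper_mean[op] - grand) ** 2 for op in operators)
ss_inter = r * sum((cell_mean[pt, op] - part_mean[pt] - oper_mean[op] + grand) ** 2
                   for pt in parts for op in operators)
ss_error = sum((x - cell_mean[pt, op]) ** 2
               for pt in parts for op in operators for x in measurements[pt][op])

ms_part = ss_part / (p - 1)
ms_oper = ss_oper / (o - 1)
ms_inter = ss_inter / ((p - 1) * (o - 1))
ms_error = ss_error / (p * o * (r - 1))

# Variance components (negative estimates are clipped to zero, as Minitab does)
var_repeat = ms_error                                   # repeatability (equipment)
var_inter = max(0.0, (ms_inter - ms_error) / r)         # operator-by-part
var_oper = max(0.0, (ms_oper - ms_inter) / (p * r))     # reproducibility (operator)
var_part = max(0.0, (ms_part - ms_inter) / (o * r))     # part-to-part

var_grr = var_repeat + var_oper + var_inter
var_total = var_grr + var_part
pct_grr = 100.0 * math.sqrt(var_grr / var_total)
print("%%GR&R (of study variation): %.1f%%" % pct_grr)
```

The %GR&R figure here is the measurement-system share of the study's total standard deviation, which is the headline number Minitab reports; the usual rules of thumb call under 10% acceptable and over 30% unacceptable.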
