Category Archives: Technology

Is Facebook Fingerprinting Chrome extensions?

This morning I noticed something new in my Chrome Console while working on a Chrome extension. It looks like Facebook is now checking whether you have a certain set of Chrome extensions installed in your browser. I looked up most of the extensions on Google’s Web Store and via its search engine, and most appear to be malware of some sort, though a few look far less harmful. It’s hard to see what the extensions actually do because they have been pulled from the Chrome Web Store, but some of them look like they intentionally modified the appearance of Facebook.

Some users don’t like being forced to see Walmart colors all over the web and have used various tactics to customize the web to their liking. Other users might do it for usability reasons, or just plain augmentation of the web.

Does Facebook have the right to do this? It feels like an invasion of my privacy. I think the latest version of Chrome protects us from this sort of attack, but that does not mean that Facebook won’t invest in other ways to discover this information, or that they won’t lobby Google to expose it. Some extensions also have the ability to open channels to other extensions, so if Facebook had its own extension it might still try to fingerprint which extensions are present.

This isn’t all bad; in fact it really depends on how it’s being used. If it’s only used to defend our privacy and security then it seems fine, but if this little trick is being abused it could really sour things.

I guess I am mostly just surprised that Facebook is doing this.

[Screenshot: Chrome Console showing the fingerprinting requests, 2012-09-05]

To get this to occur in your own browser you will need to be a bit sneaky and use a private session. Facebook doesn’t run the fingerprint all of the time; I think they only do it on auth and then cache the result in some way. In fact, since some extensions have access to cookies, I wonder if one could skip the fingerprinting by setting the proper cookie or localStorage value.
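For context, a common fingerprinting technique in that era was to probe each extension ID’s web-accessible resources: files under chrome-extension://&lt;id&gt;/ load only if the extension is installed. The sketch below is my own minimal illustration of the idea, not Facebook’s actual code; the probed resource name ('icon.png') is an assumption.

```javascript
// Build the probe URL for a given extension ID and resource path.
// Extension files are served from the chrome-extension:// scheme.
function probeUrl(extensionId, resource) {
    return 'chrome-extension://' + extensionId + '/' + resource;
}

// Attempt to load a resource from the extension; success implies
// the extension is installed. Runs in a browser, not Node.
function detectExtension(extensionId, onResult) {
    var img = new Image();
    img.onload = function () { onResult(extensionId, true); };
    img.onerror = function () { onResult(extensionId, false); };
    // 'icon.png' is a guess at a commonly bundled, web-accessible file.
    img.src = probeUrl(extensionId, 'icon.png');
}
```

A page would simply loop detectExtension over a list of IDs like the ones below, which is why later Chrome versions stopped exposing resources unless the extension explicitly whitelists them.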

Fingerprinted Extension IDs

kjafndplmofcedgddaoceenkcbfankem
kincjchfokkeneeofpeefomkikfkiedl
iejbljbhhoclgfiapmomcpkpkcmihfib
lkfhadffdnjnogmgjfihlcmmjhcfchaj
afnnkheojlgkipfabekmdkngocfhegck
hkpibllecmidllaojdmkcmfnoinmejco
gpllafflnmgjjcakjloknldkndnkmcpi
pkhidkonipdjidjglnkfcfhnkfnlefbk

If not a MacBook Pro, what Laptop should I buy?

I did this research for a friend, so I thought I would share with everyone else.

The HP laptops have the highest performance-to-cost ratio; that said, there are a few sacrifices with each model.

In your shopping you will want an Intel i5, Intel i7, or one of the AMD A6 or A8 processors. Do not get an Intel i3 processor. The Intel i5 or i7 should be fairly fuss-free and I have enjoyed them; it’s the same processor family found in the new Macs. The A6/A8 processors from AMD are really new and fewer computers have them; however, they have a graphics card built right into the processor and in theory offer really low-power operation, although this is the more daring choice.

For price/performance, I’d probably choose an i5 or the AMD A8.

Here is a list of graphics cards and their frame rates.

The HP models found below are really good, but you will need to choose the upgraded graphics processor. These are the fastest graphics processors in a laptop in your price range. I also have one of these notebooks if you would like to see how it feels. We could even try your software on it to see how it runs.

HP High Performance Laptops

The other option is the Sony S series Vaio. Its graphics card is not as fast as the HP’s, and the Sony has a semi-gloss anti-glare screen while the HP can be ordered with a matte screen. The Sony S series also has a really neat battery system and can last up to 15 hours with the extra battery (according to the site); however, it’s probably more like 7 or 8 hours with WiFi on. The Sonys should also feel better, or feel more like a Mac.

Sony S series, stylish performance

I would also recommend that you upgrade to Windows 7 Professional rather than just sticking with the Windows 7 Home version. The Professional version allows you to run software in a Windows XP mode and is more compatible with older software. I have used this mode on at least 3 different applications to get them to work.

The HP will be the fastest, but the Sony will feel nicer. HP also has a nice warranty program that can be upgraded over time, and they are the largest PC manufacturer. With the HP warranty I bought the 1-year plan, and at the end of the year you have the option to extend it. I do have to say that even though the HP is very fast and was a great bargain, it does feel quite different from working on a Mac, which is what I am used to.

I loaded up my HP and got an i7 quad-core with 2GB of graphics memory and 8GB of system memory, and a Blu-ray drive to watch movies on planes. (The Sony S can also play Blu-ray.)

If the timing were a bit different, I might want to sell mine and get a MacBook Pro 15-inch instead.

Let me know what you decide.

JSLint-Feature – Error Severity

Hi Douglas,
I have been to a good number of your talks, and I love the idea behind JSLint. I don’t even mind when JSLint makes me cry every now and then; however, I feel like the priority of JSLint should be around launching successful code early and often.

Here at Kabam we build games that run as HTML5 apps, and recently we have started to use JSLint as part of our build process. In a few cases it has caught some minor errors that we were later able to resolve.

Our build process uses Jenkins, which is an open-source fork of Hudson. These continuous-integration tools make use of a wrapper library you might be familiar with: http://code.google.com/p/jslint4java/

So let’s say a developer wants to add a new piece of code, and this code causes a problem.

JSLint reacts in the following ways:
– it has an error limit and stops reporting errors after a certain amount
– errors are not prioritized based on ‘newness’ – because that would be hard
– errors are not prioritized by severity.

The end result is that a team must fix all errors to get the maximum value out of JSLint.

I propose a 2 part solution.
1. JSLint should not hard-code error priority, beyond providing sensible defaults
2. JSLint should report an error code so that a wrapper like jslint4java could use a config file to set an error priority.

This would allow teams to triage error types as part of their build/development process.

On line 1326 you have a function warn() defined. I propose it be written something like the code below, including a new warning property: an error.type property that uses the bundle[] accessor name as the error code.

If you think of some other change that could get to the spirit of what I am looking for, that would be great too. If I get the time I might try to implement a prototype. Hudson and Jenkins have 3 levels of errors: High, Medium, and Low.

Would you have some suggestions as to what would make good defaults for each of the error.types?

I feel like this change would better support agile development and continuous integration, and would allow teams to prioritize their development efforts.

Thank you for being a beacon and leader in the community.

function warn(message, offender, a, b, c, d) {

        var character, line, warning;

        offender  = offender || next_token;
        line      = offender.line || 0;
        character = offender.from || 0;

        warning = {
            id: '(error)',
            raw: bundle[message] || message,
            type: message,
            evidence: lines[line - 1] || '',
            line: line,
            character: character,
            a: a || (offender.id === '(number)'
                ? String(offender.number)
                : offender.string),
            b: b,
            c: c,
            d: d
        };

        warning.reason = warning.raw.supplant(warning);

        JSLINT.errors.push(warning);

        if (option.passfail) {
            quit(bundle.stopping, line, character);
        }

        warnings += 1;

        if (warnings >= option.maxerr) {
            quit(bundle.too_many, line, character);
        }

        return warning;
    }
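
To sketch part 2 of the proposal: with warn() reporting an error.type code, a build wrapper in the spirit of jslint4java could load a config file mapping codes to Jenkins-style severities. The mapping below is entirely hypothetical; the specific message codes and default level are placeholders, not shipped JSLint or jslint4java behavior.

```javascript
// Hypothetical severity config a wrapper could load from a file.
// Keys are JSLint bundle message codes (the proposed error.type);
// values are Jenkins-style levels: 'high', 'medium', or 'low'.
var severityConfig = {
    unexpected_a: 'high',
    unused: 'medium',
    missing_space_a_b: 'low'
};

// Classify a JSLint warning object by its proposed `type` property,
// falling back to a default when the code is not configured.
function classify(warning, config, defaultLevel) {
    return config[warning.type] || defaultLevel || 'medium';
}
```

A build could then fail only on 'high' findings while logging the rest, which is the triage behavior the proposal is after.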

Netflix, Rushing Roulette?

Like many of us, you might be wondering why Reed Hastings is tearing a great company in two. The company had had a solid decade of growth and now it seems to be faltering at every step. In the past decade Netflix has gone from ~$3 a share to ~$300 a share, a 100x return, so investing $100 in the company would have made you $10,000; not bad. In fact that return is so good that you would be hard pressed to find a bet in Vegas that would pay out like that. Roulette only nets a 35x return if you bet on a single number, and in Craps you can make an unlikely bet to earn 30x by betting on either a 2 or a 12, each requiring a specific double to come up.

In the case of Netflix, though, there are a number of pressures bearing down on them. Content providers see Netflix’s 25 million customers and feel like DVDs/Blu-rays are more secure, and feel like they can turn content on and off to A/B test how consumers view it. The US Postal Service is thinking about cutting 238 offices nationwide; that works out to up to 5 post offices per state, or more likely up to 10 offices per metro region. Netflix has had to adjust content pricing to afford the ability to bring in new content, while other long-standing partners have decided to take their content libraries elsewhere. And then add to that that there are trillions of dollars, yes trillions, invested in the group of companies competing with Netflix over the content war brewing in your living room, and you might start to see why Netflix is being brash about this.

As a Netflix subscriber, I have been more than happy to have items sitting both in my streaming queue and in my DVD queue, and honestly, if everything on Netflix were available via streaming or download, I would opt for that format. The kicker, however, is that so much good content is left on disc for one reason or another. I have seen content appear on various outlets, like Netflix, Xbox Live, Hulu, and Amazon to name a few, but rarely is all of the content in one place. This is probably why Google feels the need to provide a TV with a search bar, and why Apple has been trying to figure out how to own the channel as well. However, in the end, the customer still loses. I think in this case, right or wrong, Netflix is just tired of talking to content providers about which formats the content will be available in. I think the re-brand is more of a point of discussion in contracts and negotiations than it is a consumer feature.

The next chamber of risk brings us to the pending demise of the post office. The US government has been looking for ways to trim the budget, and the US Postal Service has been operating at a loss for as long as I can remember; now, however, they are preparing to be on their own. There are a few bills in the House and Senate that might require the Post Office to be profitable, and so the Post Office has proposed a plan that will cut 7% (38,000) of its workers and 1% (238) of its offices from its budget just before Christmas. I think Netflix, which is one of the largest users of the post office next to Amazon, is probably concerned that the cost to ship DVDs this winter will increase, and that delivery times will too. Right now I get my DVDs in about 1 day, sometimes 2. Imagine that those discs now take 2 days, or maybe 3. How much more pressure will there be to stream? The odd part is that all of the disc-rental sites will have the same problem. And for a 2-hour film, waiting 2 days for it to arrive and then another 2 days for it to get back to Netflix/Qwikster might cut the number of DVDs you have cycling through in half. I would expect Qwikster to play with pricing over the next year to account for that. Splitting the two companies means that while Qwikster finds its groove, Netflix can focus on getting content.

On top of all that, Netflix is a 12 billion, strike that, 8 billion dollar company that is competing with Google, Apple, Microsoft, Amazon, Comcast, Time Warner, Fox, NBC, Showtime, HBO, TNT, Viacom, Disney, etc. I think the market cap of their competition is near 2 trillion dollars. Netflix needs to be more nimble than its competitors, and if by doing this they can gain even a month or two on these companies, it might be worth it. The iPad beat several companies by only 6 months, but by that time Apple had already sold enough devices to make it very expensive for others to compete there.

Finally, companies like to manage their investors’ expectations, and if the news was already bad, they have this habit of piling on bad news, or announcing bad news right before really good news. So if Netflix stock is already taking a hit, and you were going to separate the companies in Q4, but now you can just do it in Q3, you can write those numbers off earlier, push your bad books down into Q3, and then set yourself up for a decent Q4. If Netflix wants to compare Q4 of this year to Q4 of next year, doing this now helps. I also think that Netflix’s gross margins will skyrocket. Right now Netflix has a P/E of 40 at $150 a share, and it was probably a P/E of 80 at $300 a share, which was probably a bit overpriced with a gross margin of 15%. But let’s say that Netflix drops its DVD rental service and its gross margins go from 15% to 50%. Even if the company loses customers, that is probably better for the stock and the company. The company would be cash rich and better able to acquire content. Right now Qwikster is expensive to run, competes on costs, and is probably constrained to 10-20% gross margins.

Netflix, on the other hand, has been experimenting with buying its own custom content, and this may allow the company to do more of that. I would love to know what the insider trading activity looks like at Netflix.

Using Cellular Automata as a Design Principle for ‘Natural Design’

I read a really thought-provoking article about using cellular automata, prime numbers, and layered images to create seemingly random backgrounds for web pages. Yes, backgrounds for web pages.

The problem is that on so many web pages, load time is an issue and so is screen size, so it’s really hard to avoid those lines or patterns that seem to repeat and distract from the content on a variety of devices.

It’s actually a great design principle to accommodate randomness, so the eye can focus on the structured content. It’s kind of like having a great forest as the background to an architectural monument. If done right, the monument and surrounding garden will really pop against the scene.

In any case the author, Alex Walker, discusses how to use geometric patterns based on prime numbers to create a sort of randomness in the background. The examples he gives are a much more understandable explanation than, say, Stephen Wolfram’s of the value of cellular automata in creating natural systems.

I think this better illustrates what Stephen Wolfram was talking about with cellular automata.
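
For a concrete taste of what Wolfram describes, here is a minimal elementary cellular automaton in JavaScript, using Rule 30, one of his classic chaotic examples. This is my own illustrative sketch, not code from Alex Walker’s article: each cell’s next state is looked up from the bits of the rule number, indexed by its three-cell neighborhood.

```javascript
// Evolve one row of an elementary cellular automaton.
// `rule` is 0-255; bit n of the rule gives the next state for
// neighborhood value n (left*4 + center*2 + right), with cells
// past the edges treated as 0.
function step(row, rule) {
    return row.map(function (cell, i) {
        var left  = row[i - 1] || 0;
        var right = row[i + 1] || 0;
        var n = left * 4 + cell * 2 + right;
        return (rule >> n) & 1;
    });
}

// Rule 30 from a single live cell grows a chaotic triangle.
var row = [0, 0, 0, 1, 0, 0, 0];
for (var g = 0; g < 3; g += 1) {
    console.log(row.join('')); // prints 0001000, 0011100, 0110010
    row = step(row, 30);
}
```

Swap in a different rule number (Rule 110, Rule 90) and the same ten lines produce completely different textures, which is exactly the property that makes CA appealing for non-repeating backgrounds.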

It’s clearly brilliant to use it as a graphic design principle.

Maybe cellular automata (CA) rules could be used to generate layout widths and heights rather than typical asymmetrical patterns? An app like Flipboard could benefit from it, to keep the page layouts feeling designed and fresh.

I think Alex is on to something here, using cellular automata (CA) as a design pattern.

I would love to see it used in UX design and architecture, as a way to generate dynamic but structured design.

It’s brilliant. Imagine what Santiago Calatrava could do with such a pattern.

Paypal IPN Validation Fails with Adaptive Payments and PHP query/ post parameters

If you are looking for a solution you can find it here.
Thank you Gleb ( http://www.memberwing.com/ )

https://www.x.com/message/158509#158509

The problem comes from how the API is designed: it takes advantage of a little-known feature of query parameters and their allowed characters. Paypal uses array-style parameters like:

&transaction[0].status=value

The problem is that PHP does not know how to parse this query parameter and either skips it or stops processing the list (I can’t remember which).

Paypal’s Adaptive Payments API is neat and freshens up their functionality, and it additionally uses JSON as a communication layer, so I think it’s clearly their future. However, there are a number of little problems like this as you walk through getting up to speed on the API. I hope this helps someone in the future by saving them an hour to a day.
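
To see why the exact names matter, here is a quick Node sketch (my own illustration, not part of the Paypal SDK): a parser that treats each key as an opaque string, like URLSearchParams, keeps the dotted, bracketed names intact, which is precisely what PHP’s automatic $_POST parsing loses when it tries to interpret them as variable names.

```javascript
// Paypal-style "array'ed" parameter names survive when the parser
// does not try to map keys onto language variable names.
var params = new URLSearchParams(
    'transaction[0].status=Completed&transaction[0].id=123'
);

console.log(params.get('transaction[0].status')); // "Completed"
console.log(params.get('transaction[0].id'));     // "123"
```

Since IPN validation requires echoing the parameters back exactly as received, any parsing step that rewrites or drops these keys will make the validation fail, which is why working from the raw, unparsed body is the safe route.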

It kind of reminds me of some of the problems we had getting the MySpaceID API up and polished, so I guess this is a nod to all of those APIs that do it right the first time.

Cheers.

Getting VerifyStatus API working in Sandbox

There are a few caveats to getting the API working.
I hope this saves someone a few hours or days.

  • CallerServices.php has a small bug
  • Only Sandbox email accounts work in the sandbox. Thanks for confirming @ppalavilli
  • ALL Sandbox accounts have the First Name: Test. Thanks @ppalavilli
  • ALL Sandbox accounts have the Last Name: User. Thanks @ppalavilli

If you get a

PHP Warning: Missing argument 3 for CallerServices::callWebService()

On line 101 of CallerServices.php you have:

function callWebService($request,$serviceName,$simpleXML)

It should read as follows (most of the calls that use callWebService are parent::callWebService($request, $serviceName) anyway):

function callWebService($request, $serviceName, $simpleXML = NULL)
{
    $response = null;
    try {
        $endpoint = API_BASE_ENDPOINT . $serviceName;
        $response = call($request, $endpoint, $this->sandBoxEmailAddress, $simpleXML);
    }
    catch (Exception $ex) {
        throw new FatalException('Error occurred in call method');
    }
    return $response;
}

Here is the code I used. Replace {email} with one of your sandbox email addresses.

public function verify_email($params) {
    $VstatusRequest = new GetVerifiedStatusRequest();

    $VstatusRequest->emailAddress = '{email}';
    $VstatusRequest->matchCriteria = 'NAME';
    $VstatusRequest->firstName = 'Test';
    $VstatusRequest->lastName = 'User';

    $rEnvelope = new RequestEnvelope();
    $rEnvelope->errorLanguage = "en_US";
    $VstatusRequest->requestEnvelope = $rEnvelope;

    $aa = new AdaptiveAccounts();
    $response = $aa->GetVerifiedStatus($VstatusRequest);

    echo json_encode($response);
}

The JSON-encoded object looks like this:

{"responseEnvelope":{
    "timestamp":"2011-03-25T15:37:32.44307:00",
    "ack":"Success",
    "correlationId":"42bce847aebc9",
    "build":"1772158"
},
"accountStatus":"VERIFIED"
}

@jdavid

Paypal X Adaptive Pay FundingConstraint for PHP

The following code is an example of adding the FundingConstraint objects to a payRequest with Paypal’s new Adaptive Payments. I hope this helps.

// Note: depending on your SDK version, you may need to instantiate the
// intermediate allowedFundingType / fundingTypeInfo objects before
// assigning fundingType.
$payRequest->fundingConstraint = new FundingConstraint();

//$payRequest
//    ->fundingConstraint
//    ->allowedFundingType
//    ->fundingTypeInfo
//    ->fundingType = "BALANCE";

$payRequest
    ->fundingConstraint
    ->allowedFundingType
    ->fundingTypeInfo
    ->fundingType = "ECHECK";
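
For reference, the nested property path above suggests the Pay request serializes into JSON roughly like the fragment below. This is a sketch inferred from the property names in the code, plus my assumption that fundingTypeInfo is a list; it is not a captured wire payload.

```json
{
    "fundingConstraint": {
        "allowedFundingType": {
            "fundingTypeInfo": [
                { "fundingType": "ECHECK" }
            ]
        }
    }
}
```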

tumblr

I just started a tumblr account. As I test it out I may push more and more content there. It seems that tumblr has a crazy growth rate and really provides some advantages for sharing ideas. Maybe WordPress just is not social enough to keep up.

If you have a tumblr account let me know, my tumblr account is http://jdavidnet.tumblr.com

Vid.ly Beta IE size bug workaround

var n = document.getElementById('vidly-wrapper');
if (n != null && vidjs.prototype.isIe()) {
    vidjs.prototype.flashMarkup =
        '<object classid="clsid:D27CDB6E-AE6D-11cf-96B8-444553540000" width="{width}" height="{height}">' +
        '<param name="movie" value="http://vidly.dev.jentek.net/templates/js/player.swf"/>' +
        '<param name="FlashVars" value="src={file}&skin=http://vidly.dev.jentek.net/skin.xml&scaleMode=none"/>' +
        '<param name="AllowFullscreen" value="true"/>' +
        '<param name="allowscriptaccess" value="always"/>' +
        '</object>';
    vidjs.prototype.flashMarkup = vidjs.prototype.flashMarkup.replace("{width}", '610');
    vidjs.prototype.flashMarkup = vidjs.prototype.flashMarkup.replace("{height}", '360');

    n.parentNode.removeChild(n);
    new vidjs();
}