# Trader Interview Questions


Trader interview questions shared by candidates.

### If two cars are running a two-lap race on a track of any length, one going 60 mph and the other going 30 mph, how fast will the slower car have to go on the second lap to finish at the same time?

30 Answers

↳

It's impossible; the faster car will have finished the race by the time the slower car finishes the first lap.

↳

The way the question is currently worded, it does not indicate any of the following:

1. Whether the two cars started at the same place at the same time (we can infer "same place, same time" because it is a race).
2. Whether either car has already traveled any distance (if yes, then how far; if the slower car has traveled one lap, then the faster car has finished, and if no, then the answer is 60 mph).
3. What the shape of the track is (to Alanjai's point, a regular track requires offset starting positions, whereas a figure-8 track with fixed lanes would not).
4. Why the original posting was worded so poorly ("to finish at the same car to finish at the same time" is barely literate).

↳

The two pieces of missing info are: 1. the length of the track, and 2. the distance each car has already traveled. With that information you can figure it out.
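The first answer's arithmetic can be checked directly: the 60 mph car's time for both laps equals the 30 mph car's time for its first lap alone, regardless of track length. A minimal sketch (the track length `L` here is an arbitrary placeholder):

```python
from fractions import Fraction

L = 5  # track length in miles; the comparison holds for any positive L
fast_both_laps = Fraction(2 * L, 60)  # hours for the 60 mph car to finish the race
slow_first_lap = Fraction(L, 30)      # hours for the 30 mph car to finish lap 1

# The fast car is already done when the slow car starts lap 2,
# so no finite second-lap speed lets the slow car tie.
print(fast_both_laps == slow_first_lap)  # True
```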

### 37 times 37

24 Answers

↳

Two ways:

37 * 37 = (40 - 3) * (40 - 3) = 40*40 - 6*40 + 9 = 1369

37 * 37 = (35 + 2) * (35 + 2) = 35*35 + 4*35 + 4. Any two-digit number ending in 5 can be squared by taking the tens digit, multiplying it by itself plus one, and tacking '25' on the end, so 35*35 = 1225, and 1225 + 140 + 4 = 1369.

↳

This can be answered in well under 30 seconds by a simple binomial expansion: 37^2 = (30 + 7)*(30 + 7) = 30^2 + 7^2 + (2*30*7) = 900 + 49 + 420 = 1369.

↳

There is a mathematical "shortcut" for squaring any two-digit number that is probably the easiest to use:

1) Square the individual digits and place the results next to each other: 3^2 = 9 and 7^2 = 49, giving 949.
2) Multiply the two digits, double the result, and add a zero: 3*7 = 21, 2*21 = 42, 10*42 = 420.
3) Add the two numbers: 949 + 420 = 1,369.
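The shortcut in the last answer is the binomial expansion (10a + b)^2 = 100a^2 + b^2 + 20ab in disguise ("place the squares side by side" is 100a^2 + b^2). A quick loop confirms it works for every two-digit number:

```python
# Verify the squaring shortcut for every two-digit number n = 10a + b.
for n in range(10, 100):
    a, b = divmod(n, 10)
    side_by_side = 100 * a * a + b * b  # squares of the digits, juxtaposed
    cross_term = 2 * a * b * 10         # double the digit product, add a zero
    assert side_by_side + cross_term == n * n
print("shortcut verified for 10..99")
```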

### You bid for a coin. You're confident that the price of the coin is between 0 and 100. If your bid is greater than the price, you win and sell the coin to your friend at 1.5 times the price. What should you bid to maximize your profit?

16 Answers

↳

I loved this question and want to renew the debate. What do you think about my two approaches?

1) If we can only play this game once AND our goal is to maximize profit (as the question states): I agree with the answers above that the expected value of the coin is 50. Given that, we bid 51 to win the auction and pocket 24. The problem is that we only win if the coin is in (0, 50), which gives a new expected value of 25, and so we lose. We can continue deducing this way all the way down to a zero bid.

2) Nothing beats a little Monte Carlo experiment. I created a 100 × 1,000,000 matrix, where 100 is the number of possible bids and 1,000,000 is the number of uniformly distributed random prices on 0–100, and calculated the expected gain at each bid level from 0 to 100. I wish I could post a MATLAB graph here; it looks like the downward-facing half of a parabola with a max value of 0 and a min of −25. Results: the best gain of 0 is achieved at a bid of 0; the worst average gain of −25 is at a bid of 100. Comments appreciated!

↳

I had the longest argument with a friend about this. You cannot get a positive expected value no matter what you bid. If you bid $50, you can discount the cases where the coin is worth $51–100, since your payoff there is 0 (you don't win the auction). If you bid $50 and the coin is worth $50, you sell for $75 and make $25; but if the coin is worth $0, you lose $50. Keep comparing the extremes and you will see that in almost all cases you lose more than you make. That's the best I can explain it; I had to use a spreadsheet to prove it to my friend. To get an EV of 0 you'd need to change the multiplier to 2, which makes sense: if X is your bid, your expected profit is (X/2)*1.5 − X.

↳

I don't think the argument "average of 50 ⇒ bid 75" is accurate. You can't simply plug the expected value of the price into the profit curve; by integration, the area under the profit curve gives the expected profit. What you can do, however, is compute the mean of the profit directly, but you will end up with the same answer: the expected profit is negative everywhere except at 0.
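The Monte Carlo experiment described above can be reproduced in a few lines of Python rather than MATLAB — a quick sketch, assuming (as the thread does) that the price is uniform on [0, 100]:

```python
import random

def expected_profit(bid, trials=100_000, rng=random.Random(0)):
    """Estimate the expected profit of a given bid: price ~ Uniform(0, 100);
    if bid > price we pay the bid and resell at 1.5 * price, else nothing."""
    total = 0.0
    for _ in range(trials):
        price = rng.uniform(0, 100)
        if bid > price:
            total += 1.5 * price - bid
    return total / trials

# Expected profit is roughly -bid^2 / 400: 0 at bid 0, about -25 at bid 100.
for bid in (0, 25, 50, 75, 100):
    print(bid, round(expected_profit(bid), 2))
```

Analytically, E[profit] = ∫₀^bid (1.5p − bid)/100 dp = −bid²/400, which is ≤ 0 everywhere, matching the conclusion that 0 is the only non-losing bid.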

### What is the sum of the digits of all the numbers from 1 to 1000000? This is different from the sum of the numbers. For instance the sum of the numbers from 1 to 10 is 55 whereas the sum of the digits is 46.

16 Answers

↳

The main idea is that if you write all the numbers from 0 to 999999 down as six-digit numbers (prepending zeros where necessary), then every digit appears the same number of times. Each digit appears exactly 6 × 1000000/10 = 600000 times, so the result is 600000 × 45 + 1 = 27,000,001 (the +1 is for the number 1,000,000).

↳

27,000,001 is what I got. Think of each number as a six-digit number. The average value of each digit from 000,000 to 999,999 is (9+0)/2 = 4.5. Since there are 6 digits, the average digit sum of a six-digit number is 4.5 × 6 = 27. There are 1 million numbers from 000,000 to 999,999, so the sum of the digits over that range is 27,000,000. Subtract the digits of 000,000 (just 0) and add the digits of 1,000,000 (just 1) to get 27,000,001.

↳

Scott, dude, you should add the digits, not the numbers: 99 + 2 → 18 + 2 = 20, not 101.
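Both counting arguments above can be verified by brute force — a straightforward check in Python:

```python
def digit_sum_upto(n):
    """Sum of the decimal digits of every integer from 1 to n, by brute force."""
    return sum(int(d) for k in range(1, n + 1) for d in str(k))

print(digit_sum_upto(10))         # 46, the example given in the question
print(digit_sum_upto(1_000_000))  # 27000001, matching the counting argument
```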

### Here is an example of a brainteaser during the interview: You have five pirates, ranked from 5 to 1 in descending order. The top pirate has the right to propose how 100 gold coins should be divided among them. But the others get to vote on his plan, and if fewer than half agree with him, he gets killed. How will the coins end up being divided, assuming all the pirates are rational and want to end up alive?

15 Answers

↳

This is just a classic game theory question; you have to work backwards from the base case to understand each pirate's motivation.

With two pirates, p1 and p2, p1 is the final voter, so he will always vote no unless p2 gives him all $100, since he can kill p2 anyway and take the whole loot. This means p2 is in a compromised position: he does not want the game to get down to 2 pirates, and will vote yes for any offer of at least $1 from any other pirate.

When p3 is introduced, he knows p2 will vote for any plan that pays him at least $1, so p3 keeps $99 and gives the last dollar to p2. This puts p3 in a dominant position: he will vote down any plan that grants him less than $99.

When p4 is introduced, he needs two of the three voters. Since p1 will decline unless he receives everything, and p3 will decline unless he receives at least $99, p4 gives p3 exactly $99 and p2 $1; otherwise he is killed. p4 is in a compromised position, so he will accept any offer greater than 0.

When p5 is introduced, he knows p4 and p2 are stuck: the most either can earn if the game bypasses p5 is $1. Granting each of them $1 guarantees their votes, leaving $98 for himself; half the votes are positive, so he is not killed and keeps $98. The distribution for p5, p4, p3, p2, p1 is therefore 98, 1, 0, 1, 0.

↳

The answer isn't that obvious... you give the 3rd and 1st pirates one coin each and keep 98. Start at the beginning: if there are 2 pirates, it's 100 for pirate 2 and 0 for pirate 1 (pirate 1 doesn't like this). If there are three, pirate 1 will be happy with a single coin, because he does not want it to go down to 2. If there are 4 pirates, pirate 2 will be happy with a single coin, because he does not want it to get down to 3 pirates, where he would receive 0; so he gets 1 and pirate 4 gets 99. At 5 it changes a bit: here pirates 1 and 3 will be happy with single coins, because if it goes down to 4 they will receive 0. So pirate 5 takes 98, and pirates 1 and 3 take one coin each.

↳

Sean is not right. The top pirate DOES NOT vote. If there are two pirates and the top one proposes to take all 100 coins for himself, the other will vote against and the top pirate will be killed.
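Much of the disagreement above comes from ambiguity in the voting rules. Under the classic convention (the proposer also votes, a tie is enough to pass, and an indifferent pirate votes no, so buying a vote costs one coin more than that pirate's alternative), the backward induction can be sketched as follows — a hypothetical solver for that convention, not necessarily for this exact wording:

```python
def pirate_split(n, coins=100):
    """Backward-induction allocation; alloc[i] is the share of the pirate
    ranked i+1 (so the last entry is the proposer, the most senior pirate)."""
    alloc = [coins]  # with one pirate, he keeps everything
    for k in range(2, n + 1):
        # The proposer votes for himself and needs ceil(k/2) votes in total,
        # so he must buy (ceil(k/2) - 1) extra votes.
        to_buy = (k + 1) // 2 - 1
        # Cheapest votes come from the pirates who fare worst in the k-1 game.
        cheapest = sorted(range(k - 1), key=lambda i: alloc[i])[:to_buy]
        new = [0] * k
        for i in cheapest:
            new[i] = alloc[i] + 1  # one coin more than their alternative
        new[k - 1] = coins - sum(new)
        alloc = new
    return alloc

print(pirate_split(5))  # [1, 0, 1, 0, 98]: the senior pirate keeps 98
```

Under this convention the senior pirate keeps 98 and bribes the pirates ranked 1 and 3 with one coin each, matching the 98, 0, 1, 0, 1 split argued in the thread.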

### If three people are in a room, what is the probability that at least two of them were born on the same day of the week?

15 Answers

↳

The probability that all three people were born on different days is 6/7 × 5/7 = 30/49. The probability that at least two were born on the same day is therefore 1 − 30/49 = 19/49.

↳

1 − P(no two people born on the same day of the week) = 1 − (7/7 × 6/7 × 5/7) ≈ 0.39

↳

I find that none of the answers is clear, even though some are correct (19/49). Let's compute 1 minus the probability that all three were born on different days. To count all possible birth outcomes, same day or not, for 3 people: the first can be born on any of 7 days, the second too, the third too, so there are 7 × 7 × 7 possible "orderings" of births. That will be the denominator. Now for the numerator: for everyone to be born on a different day, the first person can be born on any day of the week (7 possibilities), the second on any other day (6 possibilities left), and the third on any remaining day (5 possibilities left), giving 7 × 6 × 5 (this is called an arrangement of 3 out of 7). The probability that all three were born on different days is therefore (7 × 6 × 5)/(7 × 7 × 7) = 210/343. The probability that at least two were born on the same day is therefore 1 − 210/343 = 133/343 (which equals 19/49).
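The 19/49 answer is easy to confirm by enumerating all 7³ equally likely weekday triples:

```python
from fractions import Fraction
from itertools import product

triples = list(product(range(7), repeat=3))  # all 343 equally likely outcomes
shared = sum(1 for t in triples if len(set(t)) < 3)  # at least two share a day
print(Fraction(shared, len(triples)))  # 133/343 = 19/49
```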

### Flip a coin until either HHT or HTT appears. Is one more likely to appear first? If so, which one and with what probability?

15 Answers

↳

HHT is more likely to appear first than HTT. The probability that HHT appears first is 2/3, and thus the probability that HTT appears first is 1/3. Indeed, both sequences need an H first. Once an H has appeared, the probability of completing HHT is 1/2 (because all you need is one more H), while the probability of completing HTT is 1/4 (because you need TT). Thus HHT is twice as likely to appear first. So, if the probability that HTT appears first is x, the probability that HHT appears first is 2x. Since these events are disjoint and together exhaust the probability space, x + 2x = 1, so x = 1/3.

↳

Let A be the event that HTT comes before HHT.

P{A} = P{A|H}P{H} + P{A|T}P{T} = 0.5 P{A|H} + 0.5 P{A|T}

Since P{A|T} = P{A}, it follows that P{A|H} = P{A}.

P{A|H} = P{A|HH}(0.5) + P{A|HT}(0.5) = (0)(0.5) + P{A|HT}(0.5), therefore 2 P{A|H} = P{A|HT}.

P{A|HT} = P{A|HTT}(0.5) + P{A|HTH}(0.5) = (1)(0.5) + P{A|H}(0.5)

So 2 P{A|H} = 0.5 + P{A|H}(0.5), giving P{A|H} = 1/3, and since P{A|H} = P{A}, P{A} = 1/3.

So HHT is more likely to appear first, and it appears first 2/3 of the time.

↳

This link is the best solution I have seen for this problem: http://dicedcoins.wordpress.com/2012/07/19/flip-hhh-before-htt/
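A seeded simulation (a quick sanity check, not a proof) agrees with the 2/3 figure:

```python
import random

def race(rng):
    """Flip a fair coin until HHT or HTT shows up; return whichever came first."""
    last = ""
    while True:
        last = (last + rng.choice("HT"))[-3:]  # keep only the last three flips
        if last == "HHT":
            return "HHT"
        if last == "HTT":
            return "HTT"

rng = random.Random(0)
n = 200_000
hht_first = sum(race(rng) == "HHT" for _ in range(n))
print(hht_first / n)  # close to 2/3
```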

### Two people each bid a number before throwing a 30-faced die. Whoever is closer to the rolled number wins, and wins the amount of money equal to the number thrown. E.g. I bid 15 and you bid 16; the die lands on 10, so I win 10 from you. What's the best strategy and the expected payoff?

14 Answers

↳

We choose 22. If our opponent plays optimally, he chooses 21. Let us see why. He is clearly facing a choice between 21 and 23. If he picks 23, his expected payoff is 8/30 * 53/2 - 22/30 * 23/2 = -41/30. If he goes for 21, his expected payoff is 7/10 * 11 - 3/10 * 26 = -0.1. Note that in both cases our opponent loses on average, so he chooses 21 to minimize his loss. Our expected profit is his expected loss, i.e. on average we expect to make 0.1 per game.

↳

What was your approach to the problem?

↳

I read the question as saying you bid first and your opponent bids afterwards. It's clear that when you bid on a number a, your opponent will bid on either a−1 or a+1; anything else is suboptimal for him (and bidding 1 or 30 isn't very smart). So for every number you bid on, you have two expected outcomes, depending on whether your opponent chooses a−1 or a+1. Because your opponent plays optimally, you are looking for the number whose lower outcome of the two is highest. If you choose 21, your expected winnings in the two cases are 231/30 and 255/30; if you choose 22, they are 253/30 and 234/30. Choosing 22 is therefore optimal, and your opponent chooses 21.
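The maximin logic above can be checked exhaustively with exact arithmetic — a small sketch assuming a fair die with faces 1–30, where a roll equidistant from both bids is treated as a push:

```python
from fractions import Fraction

def payoff(a, b, faces=30):
    """Expected net payoff to the player bidding a against a bid of b;
    rolls equidistant from both bids are treated as a push."""
    total = Fraction(0)
    for r in range(1, faces + 1):
        if abs(r - a) < abs(r - b):
            total += r  # we are closer: we win r from the opponent
        elif abs(r - b) < abs(r - a):
            total -= r  # opponent is closer: we pay r
    return total / faces

# The first mover picks the bid whose worst case over all replies is best.
def worst_case(a):
    return min(payoff(a, b) for b in range(1, 31) if b != a)

best = max(range(1, 31), key=worst_case)
print(best, worst_case(best))  # 22, 1/10
```

The search confirms the thread's answer: bid 22, the opponent's best reply is 21, and the first mover's expected profit is 3/30 = 0.1 per game.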

### There are 3 coins. One coin has heads on both sides, one has tails on both sides, and the third has a head on one side and a tail on the other. I pick one coin at random and toss it, and I get a head. What is the chance that the coin I picked has heads on both sides?

14 Answers

↳

2/3

↳

Because you have a 1/3 chance of picking the double-headed coin (and then you surely get heads), and a 1/3 chance of picking the fair coin and then a 1/2 chance of getting heads. So the probability of choosing the double-headed coin and getting heads is 1/3, while the probability of choosing the fair coin and getting heads is 1/6. Given that you got heads after tossing, the chance you chose the double-headed coin is (1/3)/(1/3 + 1/6) = 2/3.

↳

There are 2 heads on the double-headed coin and 1 head on the fair coin, so P(the head came from the double-headed coin) = 2/3.
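The 2/3 answer drops out of a direct enumeration of the six equally likely (coin, face) outcomes:

```python
from fractions import Fraction

coins = ["HH", "TT", "HT"]  # the faces of the three coins
# Picking a coin uniformly, then a face uniformly, makes all 6 pairs equally likely.
outcomes = [(coin, face) for coin in coins for face in coin]
heads = [(coin, face) for coin, face in outcomes if face == "H"]
p_double_headed = Fraction(sum(1 for coin, _ in heads if coin == "HH"), len(heads))
print(p_double_headed)  # 2/3
```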

### In the World Series (baseball) there are two teams, A and B. You know that each can win any game 50% of the time (1:1 odds). You also know how the series works, i.e. whoever wins 4 games first wins. What is the probability of getting to game 7 (i.e. each team wins 3 of the first 6 games)?

14 Answers

↳

Assuming no ties, it's the probability of throwing 6 coins and getting 3 heads and 3 tails, which is (6!/(3!3!))/2^6 = 20/2^6 = 5/16.

↳

To clear up any misconceptions, since there seem to be multiple answers here: 5/16 is the correct result.

↳

Whoever wins the first game does not matter (it could be either team with the lead after 1 game), so it's the next 5 games that count. There are 2^5 = 32 possible outcomes for those games. The outcomes for the team that won game 1 that force a game 7 are WWLLL, WLWLL, WLLWL, WLLLW, LWWLL, LWLWL, LWLLW, LLWWL, LLWLW, LLLWW. That's 10/32 = 5/16.
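The counting and the formula agree — a one-line check:

```python
import math
from fractions import Fraction

# P(each team wins 3 of the first 6 games) = C(6,3) / 2^6
p_game7 = Fraction(math.comb(6, 3), 2 ** 6)
print(p_game7)  # 5/16
```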