Online dating: a huge opportunity if you can make it work
Periodically I survey the internet landscape looking for business opportunities. In particular I look for online services that (a) don’t work very well and (b) can be fixed with better technology.
I haven’t found many. All the obvious online services work about as well as they possibly can. When it comes to search, retail, share trading, maps, video, music, and messaging it’s hard to see how they could be improved. Admittedly I missed the picture sharing niche – now filled by Instagram and Snapchat – but that’s probably because I’m not a teenager!
Online banking is an area where there’s a lot of room for improvement, but the constraint there isn’t technological – it’s mostly regulatory – and the barriers to entry are huge. Maybe Facebook or Google could launch a global banking service that disrupts the industry, but a start-up couldn’t.
No, the only niche I’ve found that fits my criteria is online dating. People spend enormous amounts of time on dating sites, filtering through a lot of noise, and occasionally going on dates with people who, by all accounts, may as well have been selected at random. Many people eventually give up in frustration.
So, what would an ideal dating site look like? Well, it would collect your details. It would analyze them using complex algorithms and match them against everyone else’s details. Then it would return a list of people who are the best you are going to get, and who are guaranteed to find you attractive. Simple! Although I’m not sure about the revenue model …
Why online dating doesn’t work
We’re a long way from achieving this ideal service – assuming such a thing is even possible – so let’s first figure out why the existing services don’t work.
To begin with, let’s assume that everyone can be assigned an attractiveness score from 1 to 10. If all men and all women stood in lines sorted by “attractiveness”, the top 10% would be assigned a 10, the next 10% a 9, and so on. All the tens would go out with tens, the fives with fives, and so on. There are some wrinkles when it comes to differences in age, education, and religion, but basically that’s how the real world works.
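Just to pin the thought experiment down, here's a toy sketch of the decile assignment (the names and "raw attractiveness" numbers are invented for illustration – nobody actually has a measurable raw score):

```python
def decile_scores(raw):
    """Map raw attractiveness values to 1-10 decile scores.

    raw: dict of name -> raw attractiveness (higher is better).
    Returns dict of name -> score: the bottom 10% get a 1,
    the next 10% a 2, ... and the top 10% a 10.
    """
    ranked = sorted(raw, key=raw.get)  # names, least to most attractive
    n = len(ranked)
    return {name: 1 + (i * 10) // n for i, name in enumerate(ranked)}
```

With 20 people the bottom two score a 1, the two in the middle score a 5 and a 6, and the top two score a 10 – exactly the "stand in a sorted line" picture above.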
Ranking women by attractiveness is fairly straightforward because it’s all about looks. Anyone can do it, just using photos, and will come up with the same rankings, regardless of whether they’re male or female, gay or straight, and irrespective of culture. The ability seems to be hard-wired into the human brain, although interestingly it’s not something computers can do yet.
Where things get difficult is ranking men. Looks (especially height and muscle) count for a bit, but women also want to know whether they’re successful, confident, wealthy, considerate, funny, yada yada yada. You can’t tell those things from a photo, and probably not from a written profile either.
So how do you figure out a guy’s ranking? Well, I can think of three ways …
- Date him. This is time-consuming. If you believe the Pick Up Artist community, it takes a woman 7 hours' worth of dates to decide if a man is worth sleeping with. However, it is the gold standard, and a successful matchmaking algorithm may have to somehow replicate it.
- Self-selection. One person who knows a guy’s ranking is the guy himself, at least at an instinctive level. Men will usually only approach women of the same level of attractiveness. They won’t go much higher – rejection hurts – and they won’t go much lower – hey, they can do better. I suspect women know this intuitively, and are more likely to consider a guy who has the confidence to approach them.
- Look at the exes. There may be the odd aberration, but if a guy’s exes are mostly threes and fours, he’s probably a three or four. So be careful who you have your arm around in Facebook photos!
So why is this relevant? Well, consider how dating sites work. Men contact the women, guided by their looks, just like in the real world. But there’s no face-to-face rejection. There’s no downside to contacting someone out of your league, and a huge upside. So all the guys do it.
As a result, the attractive women get flooded with requests. But they have no way of sorting the good from the bad because there’s no way to evaluate a guy’s attractiveness. They may resort to some arbitrary selection criteria, but since personality is so important, the criteria are probably no better than a coin toss. So they end up going on dates with guys who are, on average, worse than the self-selected ones they meet face-to-face. Eventually they give up on the online world and go back to the bars, book clubs, or wherever it is that attractive women hang out.
Things aren’t great for the attractive men either. They have no way to stand out from the crowd and line up a date with the attractive women they usually hook up with, so they also give up and go back to the bars, etc. And with the attractive people all leaving, the site goes into a bit of a death spiral, propped up only by an infusion of naive fresh meat.
Let’s look at a few possible ways to improve things.
- A first-principles matchmaking algorithm. I have no idea how this would work. Not only would it have to model a guy’s personality, it would have to model it when faced with a particular woman. This probably requires a full brain upload and lots of simulations, which is way beyond current technology.
- Add rejection to online dating. It’s an effective feedback mechanism in the real world, so why not in the online world as well? Well, in the real world, a woman usually has enough information to quickly identify guys who are clearly not in her league and send them on their way. But in the online world she doesn’t, and the rejection signal wouldn’t be much better than random. Rats who receive random electric shocks tend to end up depressed and unhappy, so this is almost certainly a bad idea.
- Include pictures of guys’ exes. In theory I think this would be a brilliant solution, especially if computers eventually get as good at ranking female beauty as people. But in practice there are serious privacy concerns – can you post pictures of exes without their consent? – and a strong incentive to game the system with selective pictures. But honestly, Facebook kind of works this way already, which is why people these days are more likely to exchange Facebook details than a phone number with someone they meet at a bar.
So I have to concede defeat here. I can’t figure out a service that works better than guys approaching women and saying hi. Maybe I should open a bar.
I’ve finalized the design of the augmented reality baseball cap – now called the Matt Hat – and it’s available through a Kickstarter campaign. Check it out!
A few days ago I attended a talk by the winner of the Heritage Health Prize, and it reminded me that I should document the outcome of my team’s efforts.
My team-mate and I competed under the name Planet Melbourne. We submitted entries up until the first milestone at six months, and made no further submissions after that.
This apparently made us a useful reference point during the following milestones. Because the milestone rankings didn’t show scores, movements in a team’s ranking relative to our position gave a rough-and-ready indication of the improvement in their score.
Anyway, our rankings were as follows …
- Milestone 1 (6 months): 5th
- Milestone 2 (12 months): 5th
- Milestone 3 (18 months): 14th
- Finish (24 months): 25th
After the initial announcement of the winners there were a whole bunch of disqualifications, probably of people competing under multiple accounts. As a result, our final official ranking improved to 17th.
I actually got the new head-up display manufactured a few months ago, but haven’t had time to blog about it until now.
Here it is, modelled by my co-worker Scott. Notice the HTC Tattoo held in place with a rubber band. The information appearing on its screen is reflected back off the visor to fill your field of view at a virtual distance of about a metre.
At first glance it looks pretty good, but there are two problems. First, it’s too reflective. I requested aluminium coating on the inside surface at a thickness that would let through 20% of external light, similar to a pair of sunglasses. Unfortunately they coated both sides, so it’s only letting through about 4%. You can sort of see my couch in the background, but it’s faint, so the effect is more virtual reality than augmented reality. Still, that’s easy to fix in the next version.
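The arithmetic behind that 4%, treating the two coatings as independent attenuators (and ignoring any losses in the plastic itself):

```python
single_coat = 0.20             # one coating transmits 20% of the light
both_coats = single_coat ** 2  # external light must cross two coatings in series
# both_coats is roughly 0.04, i.e. about 4% of external light gets through
```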
More serious is the image distortion. I designed the visor using the optics I learned in high school, namely how a parabola can magnify an object and make it appear further away. Well, they lied to me. It turns out parabolic reflectors only work when viewed along the axis, and when they’re viewed off-axis, e.g. from your left and right eyes, the image gets distorted, especially at the edges.
You can see in the picture above how the “22:31” text is sloping down, and that’s viewed from a camera that was fairly close to the axis. Viewed from your eyes the slope is worse, and, more importantly, the distortions are different for each eye so the images don’t line up. That makes it impossible to read text.
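To make the off-axis claim concrete, here's a small 2D ray-trace sketch (focal length, eye positions, and screen distance are all invented round numbers, not my actual visor's dimensions). It reflects a ray from the eye off the parabola y = x²/4f and finds where the reflected ray crosses the screen plane. An on-axis eye sees a perfectly symmetric mapping; an eye a few centimetres off-axis does not, which is the skew in the photo.

```python
import math

def screen_hit(eye, mirror_x, f=0.1, screen_y=0.35):
    """Reflect a ray from `eye` off the parabola y = x^2/(4f) at x = mirror_x,
    and return the x-coordinate where the reflected ray crosses y = screen_y."""
    px, py = mirror_x, mirror_x ** 2 / (4 * f)
    # unit normal to the parabola at (px, py): gradient of y - x^2/(4f)
    nx, ny = -px / (2 * f), 1.0
    norm = math.hypot(nx, ny)
    nx, ny = nx / norm, ny / norm
    # unit direction of the incoming ray, eye -> mirror point
    dx, dy = px - eye[0], py - eye[1]
    d = math.hypot(dx, dy)
    dx, dy = dx / d, dy / d
    # reflect: r = d - 2 (d . n) n
    dot = dx * nx + dy * ny
    rx, ry = dx - 2 * dot * nx, dy - 2 * dot * ny
    t = (screen_y - py) / ry  # distance travelled to reach the screen plane
    return px + t * rx

on_axis = (0.00, 0.25)   # eye directly on the mirror's axis
off_axis = (0.03, 0.25)  # eye 3 cm to the side, like one of your two eyes

# on-axis: hits for mirror points at +x and -x are exact mirror images
sym = screen_hit(on_axis, 0.04) + screen_hit(on_axis, -0.04)
# off-axis: the symmetry is gone, so the image is skewed
skew = screen_hit(off_axis, 0.04) + screen_hit(off_axis, -0.04)
```

With these numbers `sym` is zero to within floating-point error while `skew` is several centimetres, and the skew differs in sign between a left eye and a right eye.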
There’s no good solution to this. I’m writing some ray-tracing software to generate a curved surface that will show a separate image to each eye, but it has drawbacks. Each eye won’t see the entire image, and I suspect it’s going to be fairly sensitive to the position of the eyes relative to the visor.
At least this time I’ve learned a new trick for evaluating a design cheaply. I save the design in STL format, load it into Blender, make it a mirrored surface, and render it with ray tracing. If the reflected checker pattern is undistorted from the two eye positions then I’ve got something that works.
I’ve been having trouble 3D printing the transparent visor needed for my head-up display. The part sent to me by Shapeways was seriously warped and not at all transparent. So I complained about the warping, and they gave me a credit which I spent on printing the visor in white polished plastic.
The white version let me test the form factor (which seems OK), and I’ll glue some reflective film to the inside to check the optics.
If it passes that test I’ll get it manufactured properly. I’ve found a local prototyping firm who can make the visor using CNC and mirror-coat it using vapour deposition. Not cheap – we’re talking several hundred dollars – but it’ll be done properly.
Seeing this video made me realize that quadcopters are more advanced than I thought, which got me thinking about potential applications. One that came to mind was human-computer interaction.
To interact with someone as an equal we expect them to be at eye level (which is a huge problem for people in wheelchairs), and I assume the same will apply if we ever deal with intelligent machines.
In science fiction the usual solution is to put the intelligence in human-sized robots, such as C-3PO and Robby the Robot. Smaller robots, such as R2-D2 and Wall-E, are generally portrayed as being child-like and inferior.
However, in his Culture novels, science fiction author Iain M. Banks has another approach – small robots that float at eye level. These robots, called drones, range in size from hockey pucks up to rubbish bins, and are usually far more intelligent than the humans they deal with. And I suspect we could build a half-decent drone using existing quadcopter technology.
The intelligence behind the ’copter would be housed remotely, and the only extra features you’d need on-board would be wireless comms, a decent speaker, a camera, a glowing component to show “emotion”, and centimetre-accuracy navigation. Apart from the centimetre accuracy, that’s mostly stuff you’d find in a cheap smartphone. I’d also like it to carry out “nodding” and “head shaking” manoeuvres, but I assume that’s already possible with quadcopters.
I’d be really interested to see how people interact with a talking quadcopter. Would they actually engage as though it were alive, or would they treat it as just another computer, like a flying automatic teller machine? If they do engage, I could imagine quadcopter drones being used as tour guides and customer service reps.
On an unrelated note, does anyone know why the quadcopter is the dominant design? Surely a tri-copter would be just as stable, and cheaper to manufacture?
Update: I don’t know why I ask speculative questions when I can just look it up on Wikipedia. According to this page four rotors make sense because two of them can be counter-rotating, providing more stability. And they give you three axes of rotational motion, so “nodding” and “head shaking” are definitely possible.
It’s always amazing what obscure products you can buy on eBay. In this case, a 10-pack of assorted sandpaper.
Time to start sanding the visor.