# Dominion Strategy Forum


#### silverspawn

• Governor
• Offline
• Posts: 4165
• Respect: +1839
« Reply #1000 on: May 08, 2019, 06:59:28 am »
0

I was wondering about that when I first heard about the story
Logged

#### silverspawn

• Governor
• Offline
• Posts: 4165
• Respect: +1839
« Reply #1001 on: May 11, 2019, 05:03:06 pm »
0

Check this out: GPT-2 can do math. You can probably tell which part was written by me and which part was auto-completed, but someone who isn't a mathematician might not be able to. And the LaTeX code it produced actually compiles (if I take out the double $$).

Quote

Proof. Let $v \in V$ be a vector such that $S_1 \ldots S_n v = 0$. Since $S_1$ is injective, it follows that $S_2 \ldots S_n v = 0$. Generally, suppose $S_j \ldots S_n v = 0$. Since $S_j$ is injective, it follows that $S_{j+1} \ldots S_n v = 0$. Inductively, it follows that $v = 0$. So $\mathbb{R}_6$ is bounded by $R_6$ such that $S_{j+1} \to \ldots \ldots = 0$. Here's a proof of the "free path equation" on the basis of a very simple example. In its simplest form, this formula allows us to solve the differential equation $$\sum_{j=1}^{j} \frac{k}{s^2 + u_j/2}\log{c} \frac{s^2 + u_j/2}{c} = \frac{s^2 + u_{ij+1} + u_y}\log \frac{s}{c}.$$ We can expand this proof further to show that as $\theta$ gets larger, the distance gets smaller, the area goes up, and the density increases by a factor of $\log \cos{A^T)}$ and that this is what leads to the "free path equation."
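For anyone who wants to check the compilation claim, here is a minimal wrapper, assuming pdflatex with amsmath/amsfonts (for \mathbb). The math itself is the generated output reproduced as-is, with the doubled $$ replaced by a single display environment:

```latex
% Minimal compile check for the GPT-2 snippet above.
% amsmath/amsfonts are assumed; the equation is reproduced verbatim.
\documentclass{article}
\usepackage{amsmath,amsfonts}
\begin{document}
So $\mathbb{R}_6$ is bounded by $R_6$ such that $S_{j+1} \to \ldots \ldots = 0$.
In its simplest form, this formula allows us to solve the differential equation
\[
\sum_{j=1}^{j} \frac{k}{s^2 + u_j/2}\log{c} \, \frac{s^2 + u_j/2}{c}
  = \frac{s^2 + u_{ij+1} + u_y}\log \frac{s}{c} .
\]
\end{document}
```

Note that even the dubious-looking `\frac{s^2 + u_{ij+1} + u_y}\log` is legal TeX: `\frac` simply grabs `\log` as its second argument.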
Logged

#### hhelibebcnofnena

• Moneylender
• Offline
• Posts: 168
• Respect: +83
« Reply #1002 on: May 11, 2019, 10:16:16 pm »
+2

Cool! This got me intrigued, so I fed it two lines of python code, and it produced something which looked a lot like a programming language, but if it was, it wasn't python. It even included comments, though, so it must be better than I am!

This is what it produced:

```
"""
def handleAllowedArgs ( self , * args , ** kwargs ):
    parsestring = self .filter(parsestring) // If not all args have the same name, do nothing
    raise ArgumentError ( ' Unallowed args ' )

def _apply ( t ):
    """
    Applies the given argument to the object or the result of
    the applied function to the current argument.
    Args must be objects.
    """
    assert __name__ == ' __main__ '
    return self .apply(t, dict ())

def checkArgument ( self , * args , ** kwargs ):
    assert issemememop( self , args) or isinstance ( self .__class__ . __name__ , ( dict , list , tuple )):
        raise ArgumentError ( " Argument is not a list. " )
    assert __name__ == ' __main__ '

def checkIsFunction ( self , * args , ** kwargs ):
    raise ArgumentError ( " Is a function not declared by the program. " )

def checkIsArgument ( self , * args , ** kwargs ):
    raise Exception ( ' Argument is not an argument
```

I think it ran out of characters or something here. But that missing parenthesis and quote mark are really bothering me.
Logged

#### silverspawn

• Governor
• Offline
• Posts: 4165
• Respect: +1839
« Reply #1003 on: May 12, 2019, 07:31:01 am »
0

Yeah, it often just ends mid-sentence. I think there's just a character limit.

Note also that this is only a watered-down version of the original, but it's still pretty impressive.
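The abrupt cut-off is consistent with a fixed output budget: a sampling loop that appends tokens until a limit is hit has no reason to stop at a sentence boundary. A toy sketch (the `next_word` "model" here is made up purely for illustration, not how GPT-2 actually works):

```python
def generate(prompt, next_word, max_tokens=8):
    """Toy generation loop: append tokens until the budget runs out.

    Nothing checks for sentence boundaries, so the output can stop
    mid-sentence, just like the demo does.
    """
    tokens = prompt.split()
    while len(tokens) < max_tokens:
        tokens.append(next_word(tokens))
    return " ".join(tokens)

# Hypothetical stand-in for the model: always continues a canned sentence.
canned = "the proof follows by induction on the number of injective maps".split()
print(generate("the proof", lambda t: canned[len(t)], max_tokens=8))
# -> "the proof follows by induction on the number"  (cut off mid-sentence)
```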
Logged

#### hhelibebcnofnena

• Moneylender
• Offline
• Posts: 168
• Respect: +83
« Reply #1004 on: May 13, 2019, 09:30:49 pm »
0

So silverspawn has gotten me playing with this program. I entered a prompt which didn't really have anything to do with math(s), but it decided I wanted a mathematical proof of something relating to prime numbers.

This is one line:

p + q + 2 = q = 5 p q: p = 5 + q + 3 = q = 5 - 2 = 3

My favorite part of this is "5 + q + 3 = q".  At least it gets "5 - 2 = 3" right. Apparently this neural network cannot do even the simplest algebra. It is amusing though. Perhaps someday neural networks will be able to write something which sounds reasonable.
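For what it's worth, the failure is total rather than approximate: "5 + q + 3 = q" rearranges to 8 = 0, so no value of q can satisfy it, while "5 - 2 = 3" is simply true. A quick brute-force check:

```python
# "5 + q + 3 = q" rearranges to 8 = 0: no q satisfies it.
assert not any(5 + q + 3 == q for q in range(-10_000, 10_000))

# The one step the network did get right:
assert 5 - 2 == 3
```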
Logged

#### sudgy

• Cartographer
• Offline
• Posts: 3359
• It's pronounced "SOO-jee"
• Respect: +2615
« Reply #1005 on: May 13, 2019, 11:32:28 pm »
+1

2,3,5,7,11,13,17,19,23,29,31,33,36,40,44,47,51,57,59,62,65,69,73,83,86,89,100,101,104,107,110,116,117,113,115,117,114,119,124,124,125,126,128,132

Can you guess where I stopped?
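One way to answer by code, under the guess that the hand-typed prefix is all real primes: flag the first composite entry, which bounds where the human stopped (a sketch):

```python
def is_prime(n):
    """Trial-division primality test; fine for numbers this small."""
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

seq = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 33, 36, 40, 44, 47, 51, 57,
       59, 62, 65, 69, 73, 83, 86, 89, 100, 101, 104, 107, 110, 116, 117,
       113, 115, 117, 114, 119, 124, 124, 125, 126, 128, 132]

first_composite = next(n for n in seq if not is_prime(n))
print(first_composite)  # -> 33, so the real primes end no later than 31
```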
Logged

#### silverspawn

• Governor
• Offline
• Posts: 4165
• Respect: +1839
« Reply #1006 on: May 14, 2019, 05:15:16 am »
+1

Quote from: hhelibebcnofnena on May 13, 2019, 09:30:49 pm

Apparently this neural network cannot do even the simplest algebra. It is amusing though. Perhaps someday neural networks will be able to write something which sounds reasonable.

Related Slate Star Codex post:

Quote
Here’s a fun question: the human brain is undoubtedly the most powerful computer in the known universe. In order to do something as simple as scratch an itch it needs to solve exquisitely complex calculus problems that would give the average supercomputer a run for its money. So how come I have trouble multiplying two-digit numbers in my head?

The brain isn’t directly doing math, it’s creating a model that includes math and somehow doing the math in the model. This is hilariously perverse. It’s like every time you want to add 3 + 3, you have to create an entire imaginary world with its own continents and ecology, evolve sentient life, shepherd the sentient life into a civilization with its own mathematical tradition, and get one of its scholars to add 3 + 3 for you. That we do this at all is ridiculous. But I think GPT-2 can do it too.

Janelle Shane asks GPT-2 its ten favorite animals:

Prompt: My 10 favorite animals are: 1.

My ten favorite animals are:
1. Zebras with a white scar on the back
2. Insiduous spiders and octopus
3. Frog with large leaves, hopefully black
4. Cockatiel with scales
5. Razorbill with wings hanging about 4 inches from one’s face and a heart tattoo on a frog
3. Cockatric interlocking tetrabods that can be blind, cut, and eaten raw:
4. Black and white desert crocodiles living in sunlight
5. Zebra and many other pea bugs

Forget the particular animals for now (don’t worry, the razorbill with wings hanging about 4 inches from one’s face will be back in your nightmares). Notice the way it numbers its list: 1, 2, 3, 4, 5, 3, 4, 5. Last week the two-year-old child who lives next to me counted 1, 2, 3, 4, 5, 6, 7, 8, 9, 11, 8, 9, 7, 8, 9, 7, 8, 9 (his mother warns this is an inexact transcription). GPT-2 is instantiated on giant supercomputers; it’s a safe bet they could calculate the square root of infinity in a picosecond. But it counts more or less the same way as a two-year old. GPT-2 isn’t doing math. It’s doing the ridiculous “create a universe from first principles and let it do the math” thing that humans do in their heads. The fact that it counts so badly suggests it’s counting human-style, which makes it amazing that it can count at all.
Logged

#### Cuzz

• Minion
• Online
• Posts: 543
• Respect: +892