Someone's already given an answer for a non-illuminated structure, but the necessary brightness of a light to be visible is also an interesting question.
We'll assume the light is located on the dark portion of the Moon. From experience, the dimmest stars clearly visible with the naked eye when right next to the Moon are around magnitude 1, which is about 3.6x10^9 photons/sec/m^2.
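As a sanity check on that flux figure, you can back it out of the approximate V-band photometric zero point (roughly 1x10^10 photons/sec/m^2 for a magnitude-0 star; the zero point value is my assumption, not from the post):

```python
# Approximate V-band zero point: ~1e10 photons/s/m^2 at magnitude 0 (assumed).
F0 = 1.0e10
mag = 1.0
# Pogson scale: 5 magnitudes = factor of 100 in flux.
flux = F0 * 10 ** (-0.4 * mag)
print(f"{flux:.1e} photons/s/m^2")  # ~4.0e9, same ballpark as 3.6e9
```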
If we focus the light on the near hemisphere of the Earth (which has an area of 2.5x10^14 m^2) we need to produce 9x10^23 photons/sec. A green photon has an energy of around 3.7x10^-19 joules, so the total power output is 9x10^23 x 3.7x10^-19 = 333 kW.
For reference, this is roughly comparable to the wattage of the fastest electric car chargers. It's a lot of power, but well within the capability of a small lunar solar farm.
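The whole estimate fits in a few lines, if anyone wants to swap in their own numbers (all constants taken from the figures above):

```python
# Back-of-envelope: power needed for a magnitude-1 light covering Earth's near side.
flux = 3.6e9         # photons/s/m^2, magnitude-1 star at the eye
area = 2.5e14        # m^2, near hemisphere of Earth (~2*pi*R_earth^2)
E_photon = 3.7e-19   # J per green photon (~540 nm, E = h*c/lambda)

photons_per_sec = flux * area            # ~9e23 photons/s
power_watts = photons_per_sec * E_photon
print(f"{power_watts / 1e3:.0f} kW")     # ~333 kW
```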
Geographically big, if you need to see it as more than a point. 20/20 human visual acuity is around an arcminute, or 1/60 of a degree, and the whole Moon is only about half a degree across as seen from Earth. Below that, it's a matter of how good you are at picking reddish grey out from normal grey. (If anything, red has an easier time reaching your eye: the atmosphere scatters blue much more strongly than red.)
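Converting that arcminute into a physical size on the Moon is just the small-angle approximation (using the mean Earth-Moon distance):

```python
import math

d_moon = 3.84e8                 # m, mean Earth-Moon distance
theta = math.radians(1 / 60)    # 1 arcminute in radians
size = d_moon * theta           # small-angle approximation
print(f"{size / 1e3:.0f} km")   # ~112 km
```

So "geographically big" here means a structure on the order of a hundred kilometers across before it stops looking like a point.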
If it's allowed to emit light, you can go a lot smaller. I'd guess an LED array with an emitting area of 10 m² would be visible from Earth as a sort of star when in shadow, without doing any actual math on it. When competing with sunlight it becomes a visual processing problem again.
If it's allowed to focus at you, too, you can go smaller still. Diffraction limits a 1 cm aperture to a spot a few tens of kilometers wide at Earth, but a ~25 cm aperture can put its beam into roughly a single kilometer, and at a kilometer of distance you can see a candle flame if it's dark enough. It's just a beefy laser pointer at that rate.
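The aperture-versus-spot-size tradeoff follows from the Rayleigh criterion: the beam's half-angle divergence is about 1.22*lambda/D, so the spot radius at distance L is 1.22*lambda*L/D. A quick sketch (green laser at 532 nm assumed):

```python
# Diffraction-limited spot radius (Airy first null) at lunar distance.
wavelength = 532e-9   # m, green laser (assumed)
L = 3.84e8            # m, mean Earth-Moon distance

def spot_radius(aperture_m):
    """Spot radius on Earth for a diffraction-limited beam of the given aperture."""
    return 1.22 * wavelength / aperture_m * L

print(f"{spot_radius(0.01) / 1e3:.0f} km")   # 1 cm aperture -> ~25 km
print(f"{spot_radius(0.25) / 1e3:.1f} km")   # 25 cm aperture -> ~1.0 km
```

Note the inverse relationship: every factor you shrink the aperture by, the spot on Earth grows by the same factor.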