If you have designed your own DC-DC converter, there will be a feedback signal so that the controller can adjust the output voltage to the right level. This feedback is normally internal to the design, on the assumption that the losses between the point where the feedback is taken and the point where it is used can be ignored. However, in your case they can't be ignored.
The principle is remote sensing: you take the feedback from the target device itself, but this is not as simple as it sounds. You need to know how much you are losing in the supply cable to the load, and also how much you are losing in the return cable back to the power supply. Those drops then have to be accounted for at the feedback network so that the controller raises its output by exactly the amount being lost at that load current.
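To put rough numbers on that, here is a minimal sketch of the arithmetic, assuming 0.5 mm² copper conductors and the 2 A load (the real figures depend on your actual cable):

```python
# Illustrative remote-sensing arithmetic. Cable cross-section and
# current are assumptions, not taken from any particular design.

RHO_CU = 1.72e-8   # resistivity of copper, ohm*m (at ~20 C)
LENGTH = 4.0       # one-way cable length, m
AREA   = 0.5e-6    # conductor cross-section, m^2 (assumed 0.5 mm^2)
I_LOAD = 2.0       # load current, A

r_conductor   = RHO_CU * LENGTH / AREA   # resistance of one conductor
v_drop_supply = I_LOAD * r_conductor     # loss in the supply conductor
v_drop_return = I_LOAD * r_conductor     # loss in the return conductor

v_load_target = 3.3
v_psu_output  = v_load_target + v_drop_supply + v_drop_return

print(f"each conductor: {r_conductor*1e3:.1f} mOhm, "
      f"drop {v_drop_supply*1e3:.0f} mV")
print(f"PSU must regulate to {v_psu_output:.3f} V "
      f"to deliver {v_load_target} V at the load")
```

With these assumed numbers the PSU has to sit at roughly 3.85 V, which is exactly the kind of error an internally sensed design silently gets wrong.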
The bit that makes this tricky is that the feedback network is part of a closed-loop negative feedback system, and long cables with their associated time delays will make most designs unstable. Then there is the problem of what happens if one of these sense cables becomes disconnected (usually the output voltage rises in an uncontrolled way). The solution is to have a high-impedance bias from the main output as seen at the PSU, but a far lower-impedance route from the remote sensing point, so the remote point dominates whenever it is connected. Much the same is needed for the 0V return.
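You can see why the remote point dominates by treating the feedback node as a resistive divider between the two sources. The resistor values below are illustrative assumptions, not a reference design:

```python
# Fail-safe sense network modelled as a resistive divider: a weak bias
# from the local output and a low-impedance path from the remote sense
# point both feed the same feedback node. Values are assumptions.

R_LOCAL_BIAS  = 1_000.0   # ohms, weak pull from the PSU's own output
R_REMOTE_PATH = 10.0      # ohms, series R plus sense-wire resistance

def fb_node_voltage(v_local, v_remote=None):
    """Feedback-node voltage as the conductance-weighted average."""
    if v_remote is None:          # sense wire disconnected:
        return v_local            # the bias resistor takes over
    g_local, g_remote = 1 / R_LOCAL_BIAS, 1 / R_REMOTE_PATH
    return (v_local * g_local + v_remote * g_remote) / (g_local + g_remote)

print(f"{fb_node_voltage(3.85, 3.30):.3f} V")  # ~3.305: remote dominates
print(f"{fb_node_voltage(3.85):.3f} V")        # 3.850: loop stays closed
```

With the sense wire attached, the feedback node sits within a few millivolts of the remote point; with it disconnected, the loop falls back to regulating the local output instead of running away.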
To make things worse, sensing over a 4 m distance means you cannot use a 4-core cable (i.e. power, return, feedback+, feedback-), because the coupling between the adjacent cores within the cable corrupts the feedback signals. That means the feedback needs to come back as a separate shielded 2-core cable, and this must be well made (pigtail shield connections will ruin it).
In short, this is a non-trivial thing to design. If there is no way of moving the DC-DC power supply close enough to the point where the 3.3V supply is used, another solution is needed. The only easy answer left is Ohm's law: reduce the resistance of the wire and the voltage loss falls with it. Of course, you will need thicker cable, which will cost more, but unless you're going to redesign your power supply for remote sensing, or relocate the DC-DC converter, there aren't many other options.
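A quick sizing sketch, assuming (as an example) that you can tolerate 100 mV of total drop across the whole loop; pick whatever budget your load actually allows:

```python
# Cable sizing by Ohm's law. The 100 mV drop budget is an assumption.

RHO_CU   = 1.72e-8    # ohm*m, copper at ~20 C
LOOP_LEN = 8.0        # m of conductor in total: 4 m out + 4 m back
I_LOAD   = 2.0        # A
V_BUDGET = 0.100      # V, allowed drop across the whole loop (assumed)

r_max = V_BUDGET / I_LOAD              # 50 mOhm for the full loop
area  = RHO_CU * LOOP_LEN / r_max      # required cross-section, m^2
print(f"need >= {area*1e6:.2f} mm^2 per conductor")   # ~2.75 mm^2
```

That works out to roughly 2.75 mm² per conductor, so in practice the next standard size up (e.g. 4 mm², around AWG 12), which is a noticeably heavier cable than you might have planned for a 2 A load.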
One other option, if you can change the design of the load, is to make it tolerant of whatever input voltage it receives (within limits). If your circuit is a 3.3V circuit and does not have a regulator at the power input, you could modify it to put a SEPIC converter in the load at the point where the power is going to be used. A SEPIC converter is a switching regulator whose input can be below the required output level (it steps the voltage up), above it (it steps it down), or equal to it (it passes it through). These converters are readily available from Analog Devices, Texas Instruments and many other companies. Of course there will still be power loss in the cable and in the SEPIC regulator itself, so the actual current drawn from the supply will be higher than 2A to make up for this.
The one other obvious remark is that the SEPIC converter does not compensate for the fact that 0V at the DC-DC power supply (4m away) will not be the same as 0V at the point where the power is used. This may affect things if you have any DC-coupled signals that pass between these points. The IR drop in the original cable is still present, and only you can answer whether that is going to be an issue or not.
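For a feel of the size of that offset, again with the same assumed cable and the ~3.2 A supply current from the SEPIC sketch:

```python
# 0V offset between the two ends of the return conductor.
# Cable resistance and current carry over from the assumed examples.

R_RETURN = 0.1376   # ohm, one 4 m conductor of 0.5 mm^2 copper
I_SUPPLY = 3.2      # A, supply current from the SEPIC sketch above

offset = I_SUPPLY * R_RETURN
print(f"0V at the load sits ~{offset*1e3:.0f} mV above 0V at the PSU")
# ~440 mV: a large slice of a 3.3 V logic-low threshold, so DC-coupled
# signals referenced to different "grounds" can be misread.
```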