
How to get contracted developers to read, accept and adhere to development guidelines?

Hi Folks,

As you can tell, my question is of a more process than technical nature, and I'm basically looking for ideas and how-tos.

Here is what we already have in place to make our development guidelines available:

  • Our guidelines are available online via an easy to access website
  • New developers should receive a two-page PDF document via email with instructions on where to find the guidelines and how to access them. The second page contains some important do's and don'ts with links to the relevant sections of the overall guidelines
  • The guidelines can also be accessed via SE80/SE84 and if they don't get displayed there yet, developers get some information about how to have them show up (plus a link of where to find them online)

Where this often breaks down is that the PDF doesn't get sent out in the first place, and there's also no formal process to have developers acknowledge that they received, read and understood the guidelines and plan to adhere to them.

Have any of you implemented some kind of process within SAP to streamline and document this? I'm for example wondering if it could work to send the PDF document via the SAP Business Workplace as an express document to new developers, and have them acknowledge via a response message that they received and read it. They'd then only get their developer keys after we received this response. Are there perhaps even some standard workflows for something like this?

Do you have other suggestions based on what is already done elsewhere?

Thanks for any feedback you have!




  • Thanks to all who provided answers, comments, feedback and suggestions in this lively thread!

    I just now picked Mike Pokraka's response as the "best answer" as his suggestion will be what I'm going to implement. All the other feedback and discussions in this thread also provided lots of food for thought and I have a hunch that there's ample material for a blog post or two just waiting to be written to summarise some of the content.

    If it's okay, I'll not close the discussion so that it stays open for any additional thoughts and suggestions.

    Thanks again & cheers



9 Answers

  • Best Answer
    Posted on Feb 06, 2018 at 07:31 AM

    This is so funny, because it's one of the simple examples I dreamt up for the Workflow book (Ch. 16.3.2 in the third edition if you have a copy somewhere in your office).

    To summarize: put a dummy task with all the information as the task text in a workflow, and use the task's 'confirm end of processing' flag to act as a confirmation. As part of your onboarding, send it to them.
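    For illustration, a minimal sketch of how such a confirmation task could be started programmatically during onboarding - the task number TS99900001 and the user ID are placeholders, and the 'Confirm end of processing' flag itself is set on the task definition (transaction PFTC), not in code:

    ```abap
    * Sends the (hypothetical) guidelines task TS99900001 to a new
    * developer's Business Workplace inbox. Because the task has the
    * 'Confirm end of processing' flag, the work item only completes
    * once the developer explicitly confirms it.
    DATA: lv_return_code TYPE sy-subrc,
          lt_messages    TYPE TABLE OF swr_messag.

    CALL FUNCTION 'SAP_WAPI_START_WORKFLOW'
      EXPORTING
        task          = 'TS99900001'    " placeholder task number
        user          = 'NEWDEVELOPER'  " the onboarded developer
        language      = sy-langu
        do_commit     = 'X'
      IMPORTING
        return_code   = lv_return_code
      TABLES
        message_lines = lt_messages.

    IF lv_return_code <> 0.
      " e.g. write lt_messages to an application log and alert an admin
    ENDIF.
    ```

    Tying the hand-out of developer keys to the confirmation could then be a simple report over the work item status for that task.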

    But when I read "we don't have a proper QA-process (yet)", I would really focus less on guidelines and more on code quality - time is better spent there than on this type of thing. I've seen some truly awful code that conforms to guidelines, and on the flip side there is no one-size-fits-all when it comes to guidelines. Use the Code Inspector and start adding the worst transgressions to your custom checks.


  • Posted on Feb 06, 2018 at 08:05 AM

    I've worked in many places. In some, developers have a good attitude and you can trust them to read and adhere to the standards. In other places, the only way to get them to adhere is to stand over them with a hand gun (I think this is called "pair" or "extreme" programming). Seriously, if you've got externals who won't do what they're told - sack 'em. For internals, you just have to do whatever the socially acceptable form of beating is in your organisation. I've no time for people who won't adhere to a few simple rules - either they're not to be trusted, or they're incompetent.

    I have a good attitude, so I always make sure to read the standards and adhere to them - no matter how daft I might find some of them. (E.g. one client insists that all variables must be declared at the top of the code unit - well, that'll work well when 7.4 turns up... with inline declarations!)

    Peer code review is a good approach if you don't have enough resource to have a separate QA section. It also means that you've got (supposed) experts doing the work. This does rely on attitude. I remember one project I was on, where it was done diligently, as the developers realised that they could learn off each other not just the standards, but good techniques. It became a game to find violations. In another project, the developers found that the regulatory authorities were happy with a signature that a review had been carried out - so they just signed off every development without looking at them. I was in charge of all the developers across the project, and did a few spot checks and discovered this - unfortunately, the higher management team didn't give a monkey's about actual quality, so they took no action.

    I did a webinar for SCN 9 years ago on this subject, but the video appears to be no longer available. However, this blog is worth reading and you can see my comment there.

    An important factor in peer code review is that a junior reviews a senior and vice versa. That way everyone benefits and it spreads the load and programming knowledge and good practice.

    If your standards don't contain too much that's irrelevant, that will help. Concise standards are far more likely to be adhered to than a fifty-page document. Don't sweat the small stuff. I strongly recommend removing anything that says that, e.g., local variables must be prefixed with l_. Rather, follow Horst Keller's guidelines in the book "Official ABAP Programming Guidelines".

    A few years ago, I got to define the standards for a client.

    All code must be:

    • Readable
    • Well modularised
    • Maintainable
    • Robust

    That was it. But that was with a small, close-knit team. For a larger team, you do have to enlarge on what those terms mean and give guidelines for achieving them.

    For most clients, I have to sign that I've read the guidelines and standards and will adhere to them. For some, every now and then I have to sign that I've re-read them. For one, this is achieved and recorded through LSO. In their validated systems, there are ATC checks that must be passed. This works quite well, as it has spotted some programming issues that I'd missed. If you have some processes and restrictions that are an absolute must, put them into ATC and block the release of transports that don't pass the checks.

    But to my mind, for improving code quality and ensuring adherence to the rules, nothing beats a peer code review (with spot checks to catch sign-offs done without actually checking).


  • Posted on Feb 06, 2018 at 08:58 PM


    Let me start by saying that the last question you posted (and referenced here) is a remarkable example of how to use this platform, and it looks like you are keeping it up (maybe consider blogging as well, we could all benefit from it, I believe).

    Now to the matter at hand: while I second Matthew Billingham's point about the importance of code review, I would also add that you need to put in place some "ego canceling" method (e.g. someone whose only job is code review, where every remark is handled professionally and with reasoning behind it to justify it).

    As for confirming your dev. guidelines, here are some practical ways to enforce it (some I saw at previous workplaces and some are a figment of my imagination but, I think, doable):

    • Have a daily system message with some part of the guidelines.
    • Have a user exit/enhancement/BAdI added to SE38/SE80 that blocks their use unless developers tick a checkbox confirming they have read the guidelines.
    • When opening a user account for a developer, make it a prerequisite that before receiving the user and password they have to confirm the guidelines or pass a quiz on them with some score threshold.
    • Have a random programmer's code checked against the guidelines and email the conclusions to the entire team (do that in a way that doesn't single anyone out - did I say "ego canceling" already :) ).
    • Send the programmers (unlike my first point which sends to all the users) a message (it could be done based on this - didn't test it myself).
    • Periodic emailing.
    • Single out your weakest link / most important project and focus your code review resources on it.

    This list is, of course, incomplete and adds (sometimes even tangentially) to other points that were already made.

    As for adhering to the guidelines, apart from what has already been said I would add: choose the best and brightest people you can to begin with, insist on good programmers, and demand the ones you've established rapport with - you are paying for it, after all. Have a longer process for interviewing/allowing a contractor to work with you. This advice is a cliché by now for a reason :)
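    To illustrate the "message to the programmers only" idea above (as opposed to a system-wide SM02 message), here is a rough, untested sketch; the list of user IDs is a placeholder, and TH_POPUP only reaches users who are currently logged on:

    ```abap
    * Pops up a short reminder for each developer who is logged on.
    * lt_developers would come from e.g. a custom table of registered
    * developers - hard-coded here only for the sketch.
    DATA: lt_developers TYPE TABLE OF sy-uname,
          lv_user       TYPE sy-uname.

    APPEND 'DEVELOPER01' TO lt_developers.   " placeholder user IDs
    APPEND 'DEVELOPER02' TO lt_developers.

    LOOP AT lt_developers INTO lv_user.
      CALL FUNCTION 'TH_POPUP'
        EXPORTING
          client         = sy-mandt
          user           = lv_user
          message        = 'Reminder: please (re-)read the dev guidelines'
        EXCEPTIONS
          user_not_found = 1
          OTHERS         = 2.
      " sy-subrc = 1 just means this developer is not logged on right now
    ENDLOOP.
    ```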


  • Posted on Feb 08, 2018 at 08:17 AM

    Reminds me of a code jam we did before TechEd a few years back...

    It completely failed when we went to demo it, but... The answer you seek is gamification!

    Badges will surely help here... You can get badges for completing a review, finding different types of errors, responding to review comments... Ah the possibilities are endless!

    Hook it into the branch merge (pull request in GitHub) functionality of abapGit, and we could have an ABAP development environment that might even be fun!

    Unfortunately, it seems to be the sort of thing that only gets addressed in demo jams.

    Good luck!



    • You could do the scoring according to the lowest number of issues found. Of course, you'd have to audit carefully.

      When I first encountered peer review, the programmers were put in pairs and we had to review each other's code. With my partner it got very competitive, and over time we had maybe a couple of findings per program. One time I had to review someone else's code - something like 60 findings. :-o

  • Posted on May 22, 2018 at 08:30 AM

    Hello everyone,

    I see the challenges Bärbel mentioned in my organisation as well.

    One of the key points is that you need the backing of the program lead to "force" these rules and regulations, but this is where it comes up short from time to time. People/consultants/contractors state: "this will have an impact on the timeline". Same discussion as with #unitTests. Yes, in the beginning... But often this is not accepted by management (with its short-term outlook).

    Please, @Management: let those geeks do their work and care for quality, consistency, and code perfection! It will pay off over the lifetime of the system.

    My advice to overcome the mentioned hurdles:

    - Print some copies of your dev. rules and hand them out to every new developer.

    - Talk to them and ask their opinion of those rules and whether they have suggestions (after they have read them).

    - Use ATC & ABAP Open Checks on a daily basis and talk about the results in dev. team meetings.

    - Review your guidelines: don't overload them and end up rewriting an ABAP book.

    - Offer information/training, even to external consultants, on how to use ATC to simplify their everyday work.

    - Try not to have so-called "developing consultants" (consultants who learned to type ABAP but don't have a clue about software development). (A good key question is: what would you use an interface for?)


  • Posted on Feb 06, 2018 at 06:48 AM


    I am answering your question assuming your emphasis is more on how to make the developer adhere to Dev guidelines.

    Then, the only way is:

    1. Strictly enforce code review and let the developers take corrective action on the review comments.

    2. Only after Step 1 is done, approve the changes to be moved to QA.



    • Got it :-)

      "Adherence to the guidelines basically hinges on the developers' good will"

      E X A C T L Y

      Another way is to link it to money.

      For every NCA (Non-Compliance), penalise them :-)

      It saves costs, and over a period of time people will fall in line.


  • Posted on Feb 06, 2018 at 10:28 AM

    Another approach would be to use a BAdI upon Request / Release of Transport, with some kind of Checklist (ticking the Boxes for 'having done a Unit-test' and 'Read and Conformed to the Coding Guidelines' as an example) preventing Code from being "Released in the Wild" ...

    By storing and reporting on these indicators (through ad hoc Peer Review?) you might have something 'in writing' as a Point of Discussion or Reprimanding the Developer ...

    Just to be clear : we haven't implemented anything like this at my Company, but I have thought about something like this in the past. Inspiration was gathered through this Blog : link

    So even though this is an "after the fact" approach, it could at least make the general compliance with your coding guidelines and culture visible?
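    For what it's worth, an implementation of that idea could look roughly like the following - untested, with the method signature abbreviated (see BAdI definition CTS_REQUEST_CHECK in SE18 for the full parameter list):

    ```abap
    * Sketch of BAdI CTS_REQUEST_CHECK, method CHECK_BEFORE_RELEASE:
    * ask the releasing developer to tick off the checklist and block
    * the release if they cannot confirm it.
    METHOD if_ex_cts_request_check~check_before_release.

      DATA lv_answer TYPE c LENGTH 1.

      CALL FUNCTION 'POPUP_TO_CONFIRM'
        EXPORTING
          titlebar      = 'Release checklist'
          text_question = 'Unit test done and coding guidelines followed?'
          text_button_1 = 'Yes'
          text_button_2 = 'No'
        IMPORTING
          answer        = lv_answer.

      IF lv_answer <> '1'.   " '1' = first button was chosen
        " optionally log REQUEST and SY-UNAME to a custom table first,
        " so the confirmations can be reported on later
        RAISE cancel.        " blocks the release of the transport
      ENDIF.

    ENDMETHOD.
    ```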


    • Thanks for your feedback, Nic!

      A while ago I "played" a bit with the BAdI you mention (ZCL_IM_CTS_REQUEST_CHECK), but for the transport creation to try and get developers to put some specific type of information into the title. These trials however never made it out of a sandbox system thus far. The aspect of running SCI during task- instead of transport-release will hopefully soon get tackled by setting up a central ATC-system with the latest version which will allow this option via configuration of the checks. Once we have this system up and running, we'll also start making use of preventing release of a task/transport with errors.



  • Posted on Feb 09, 2018 at 10:11 AM

    First of all, very interesting point and discussion, thanks for starting it. We've had the same questions over the last years and were facing the same problem of how to enforce this and how to have enough (internal) capacity for code reviews, especially in larger projects. And we've definitely not solved the issue, especially the process one; however, one thing that already helps to a certain extent is the following technical piece:

    • We've started putting some of our guidelines (including naming conventions) into a Code Inspector Check Variant. We've developed some custom checks as well for things that are not available in SAP standard. I would like to mention this excellent resource on this occasion: Git repo with ABAP Open Checks.
    • Every developer is encouraged to use this variant from the start of development and not only at the very end.
    • The Check Variant is configured as mandatory on release of transport and any prio 1 or 2 finding will block the release. While externals are not allowed to release transports themselves they are still responsible for fixing their code until it can be released from DEV, so it will come back to them.
    • Developers are adapting quickly and are making sure that their transports can be released, rather than having them blocked for days until they are back in the office with the business department waiting at their door because testing could not be started in, e.g., the QAS system.
    • Make sure to communicate early and transparently with all developers. :)

    So far our experience with this is quite positive and we will definitely continue on this route.
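    As a rough illustration of the "custom checks" point above: custom Code Inspector checks typically inherit from CL_CI_TEST_SCAN and redefine RUN. The skeleton below is untested and abbreviated (attribute names and the exact inform( ) signature can vary by release); it flags BREAK-POINT statements as an example of a guideline violation:

    ```abap
    * Minimal custom Code Inspector check skeleton: walk the token
    * table of the scanned object and report any BREAK-POINT found.
    CLASS zcl_ci_test_guidelines DEFINITION PUBLIC
      INHERITING FROM cl_ci_test_scan.
      PUBLIC SECTION.
        METHODS run REDEFINITION.
    ENDCLASS.

    CLASS zcl_ci_test_guidelines IMPLEMENTATION.
      METHOD run.
        CHECK ref_scan IS NOT INITIAL.
        LOOP AT ref_scan->tokens ASSIGNING FIELD-SYMBOL(<token>)
             WHERE str = 'BREAK-POINT'.
          inform( p_sub_obj_name = object_name   " object being checked
                  p_line         = <token>-row
                  p_kind         = c_error       " treat as prio 1 finding
                  p_test         = myname
                  p_code         = '0001' ).     " maps to a message text
        ENDLOOP.
      ENDMETHOD.
    ENDCLASS.
    ```

    Registering the check class and adding it to the mandatory check variant then makes it part of the transport-release gate described above.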




    • That's a shame and I do sympathise with your difficulties. What's needed is for adherence to client development standards to be written into the outsourcing contract - i.e. if they don't adhere, they're in breach. Unfortunately, if you've management who view developers as a commodity (one pretty much the same as another), then there's not much you can do. (See Michele's blog here).

  • Posted on Mar 22, 2019 at 06:10 PM

    We are doing the following to help with this issue:

    1 - We have a formal standards document

    2 - Developers are required to sign a statement saying they have read the formal standards document and will follow it

    3 - All code is code reviewed. Included as part of that is the mandatory use of the code inspector with agreed upon categories that we have documented. If the code fails the code inspector then that is marked up as such as well.

    4 - We have a spreadsheet where we track the result of all code reviews and we also include major categories as to why it failed and we review that data at every meeting. The spreadsheet is viewable by all but only modifiable by the code reviewers.

    5 - If all else fails, we have terminated a few people who were consistent repeat offenders or who we felt weren't cutting it.

    Note - Sometimes the reviewer makes mistakes too (never happens! - lol). If the reviewer makes a mistake, here is what I recommend to do:

    1 - Admit you made the mistake to the person in an email. If applicable, let the entire team know.

    2 - Correct the data you have in the scoring system you are using (in my case, it's updating the spreadsheet that the initial review passed) and let that person know it was corrected. It will make them feel a lot better if you admit you made a mistake to them and that you corrected the data.

    We have added things to our code review process over the years as follows:

    1 - Use of the code inspector

    2 - Also checking to make sure functional spec was updated with technical details by developer if a functional spec was written for the change

    3 - We have code reviews enforced by ChaRM. A piece of code cannot get from development to QA unless a code reviewer has reviewed it and marked it as approved.

    4 - Track code review failure data with a group readable spreadsheet

    Code Reviewer code of conduct:

    1 - Do not make it personal. Everyone makes mistakes or forgets something once in a while. This is not a process for taking someone to the whipping shed, especially for something minor.

    2 - If you bring an issue to the team where there is a pattern, there is little to no need to show who created or caused the issue. When I present issues that I want to point out, I purposely remove any and all identifying information including the name of the program where I found the issue and used as an example. This goes along with point #1 in not making it personal.

    3 - Be professional. There is no need to slam someone or call them names; just point out what the issue is and, if necessary, which pages in the standards it violates. One reason for this is that if the reviewer makes a mistake and isn't nice about it, it really comes back to haunt you and doesn't make you look good.

    4 - Be consistent and firm. If you ding something for one person, you need to ding the same thing for everyone. If you let one person get away with something and someone else got dinged repeatedly, this leads to a very quickly escalating credibility issue.

    5 - Have a consistent escalation process for repeat offenders. If a person is repeatedly doing the same thing and getting dinged on it, you can define an escalation process that first brings it to their attention and then if that doesn't fix it you can increase what you do.

    These are things I have learned over a career of doing software QA for not only software development at my current company but other software development work as well for other companies including a contracting firm.


    • Bärbel - you're welcome, and thank you for the compliments. My advice is to just keep plugging away at it, making incremental improvements, and before you know it things will have improved. When we first started doing code reviews, we didn't have a spreadsheet, we didn't have the code inspector process, and we didn't have ChaRM. All of those things were implemented over time, and we continually evaluate our processes to see where they can be improved.
